Apr 17 17:26:23.196018 ip-10-0-138-42 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:26:23.662025 ip-10-0-138-42 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:26:23.662025 ip-10-0-138-42 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:26:23.662025 ip-10-0-138-42 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:26:23.662025 ip-10-0-138-42 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:26:23.662025 ip-10-0-138-42 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
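The deprecation warnings above all point at the same remedy: move these flags into the KubeletConfiguration file named by --config. A minimal sketch of the equivalent config-file settings is below; the containerRuntimeEndpoint socket path matches the value in this log's flag dump, while the volumePluginDir, systemReserved, and evictionHard values are illustrative placeholders, since this log does not show the values actually in use:

```yaml
# Hedged sketch of a KubeletConfiguration fragment covering the deprecated
# flags warned about at startup. Verify the real flag values on the node
# before migrating; placeholders are marked below.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (socket path appears in this log)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (placeholder sizes)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
# (placeholder threshold)
evictionHard:
  memory.available: 100Mi
```

On this node the file would be /etc/kubernetes/kubelet.conf, per the --config flag logged further down.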
Apr 17 17:26:23.664463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.664378 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668594 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668616 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668621 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668624 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668627 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:23.668621 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668630 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668633 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668636 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668639 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668641 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668644 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668648 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668651 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668653 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668656 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668659 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668662 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668664 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668667 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668669 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668672 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668674 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668677 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668680 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668683 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:23.668856 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668685 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668688 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668691 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668693 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668714 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668717 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668720 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668723 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668728 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668732 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668735 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668738 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668741 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668744 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668747 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668750 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668753 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668756 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668759 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:23.669331 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668762 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668764 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668767 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668770 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668772 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668776 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668778 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668781 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668783 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668787 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668790 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668793 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668796 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668799 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668801 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668804 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668806 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668809 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668812 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:23.669824 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668815 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668817 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668820 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668823 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668825 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668828 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668831 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668834 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668836 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668839 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668842 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668845 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668848 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668851 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668853 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668856 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668859 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668862 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668865 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668868 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:23.670287 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668870 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668873 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.668876 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669313 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669320 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669323 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669326 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669329 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669332 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669335 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669338 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669341 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669344 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669347 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669349 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669352 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669354 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669357 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669360 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:23.670772 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669363 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669366 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669369 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669372 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669374 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669377 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669379 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669382 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669385 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669388 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669390 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669393 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669395 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669398 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669400 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669403 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669405 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669408 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669410 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669413 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:23.671251 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669415 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669419 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669421 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669424 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669426 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669429 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669431 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669434 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669436 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669439 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669441 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669443 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669446 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669449 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669451 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669454 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669456 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669459 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669462 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669464 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:23.671773 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669467 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669470 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669472 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669475 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669477 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669480 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669482 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669485 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669487 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669490 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669492 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669494 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669499 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669503 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669506 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669508 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669511 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669515 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669517 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669520 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:23.672258 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669522 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669525 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669527 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669530 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669532 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669536 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669538 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669541 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669544 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.669546 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669629 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669649 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669658 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669668 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669674 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669677 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669681 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669686 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669689 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669692 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669710 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:26:23.672756 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669714 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669717 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669720 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669723 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669726 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669729 2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669732 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669735 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669741 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669744 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669747 2571 flags.go:64] FLAG: --config-dir=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669750 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669753 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669757 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669760 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669764 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669767 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669770 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669773 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669777 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669780 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669783 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669787 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669790 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669793 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:26:23.673266 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669796 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669799 2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669802 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669807 2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669810 2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669813 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669817 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669820 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669823 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669826 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669829 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669833 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669836 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669839 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669842 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669846 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669849 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669852 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669855 2571 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669859 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669862 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669865 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669868 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669871 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669874 2571 flags.go:64] FLAG: --help="false"
Apr 17 17:26:23.673893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669877 2571 flags.go:64] FLAG:
--hostname-override="ip-10-0-138-42.ec2.internal" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669880 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669883 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669886 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669890 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669893 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669896 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669899 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669902 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669905 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669908 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669912 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669915 2571 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669918 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:26:23.674490 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:26:23.669921 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669924 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669927 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669930 2571 flags.go:64] FLAG: --lock-file="" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669932 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669935 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669938 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669944 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669947 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669950 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:26:23.674490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669953 2571 flags.go:64] FLAG: --logging-format="text" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669956 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669959 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669962 2571 flags.go:64] FLAG: --manifest-url="" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669965 2571 flags.go:64] FLAG: 
--manifest-url-header="" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669969 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669972 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669976 2571 flags.go:64] FLAG: --max-pods="110" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669979 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669982 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669985 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669988 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669991 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669995 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.669998 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670005 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670008 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670012 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670015 2571 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:26:23.675164 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:26:23.670018 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670024 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670027 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670030 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670034 2571 flags.go:64] FLAG: --port="10250" Apr 17 17:26:23.675164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670037 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670039 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-097fe430a62ebfaf6" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670043 2571 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670046 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670049 2571 flags.go:64] FLAG: --register-node="true" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670051 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670055 2571 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670059 2571 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670062 2571 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670064 2571 
flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670067 2571 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670071 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670074 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670077 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670080 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670083 2571 flags.go:64] FLAG: --runonce="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670086 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670089 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670092 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670095 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670098 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670101 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670104 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670107 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:26:23.675783 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:26:23.670110 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670113 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:26:23.675783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670116 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670124 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670127 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670130 2571 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670139 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670144 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670147 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670150 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670155 2571 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670157 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670160 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670163 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670166 2571 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670169 2571 flags.go:64] FLAG: --v="2" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670174 2571 flags.go:64] FLAG: --version="false" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670178 2571 flags.go:64] FLAG: --vmodule="" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670182 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670185 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670287 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670291 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670294 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670297 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670300 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670303 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:23.676435 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670306 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670308 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:23.677016 
ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670311 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670314 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670316 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670319 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670322 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670324 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670327 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670331 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670334 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670336 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670340 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670343 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670345 2571 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670348 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670351 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670353 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670356 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670359 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:23.677016 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670361 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670364 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670367 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670369 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670372 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670374 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670377 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670379 2571 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670382 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670385 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670387 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670390 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670392 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670395 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670397 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670400 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670402 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670405 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670407 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670410 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:23.677521 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670414 2571 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670419 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670424 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670428 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670432 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670435 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670438 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670440 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670443 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670445 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670447 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670450 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670453 2571 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670455 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670458 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670460 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670463 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670465 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670468 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:23.678029 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670470 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670473 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670475 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670478 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670480 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670482 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:23.678510 ip-10-0-138-42 
kubenswrapper[2571]: W0417 17:26:23.670485 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670488 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670490 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670493 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670496 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670498 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670501 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670503 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670507 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670510 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670512 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670516 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670519 2571 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:23.678510 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670521 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.670524 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.670530 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.677517 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.677544 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677607 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677612 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677615 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677621 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677626 2571 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677630 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677634 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677636 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677639 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677642 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:23.678996 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677645 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677648 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677651 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677653 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677656 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677658 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677661 2571 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677664 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677666 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677669 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677672 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677674 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677677 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677680 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677682 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677686 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677688 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677691 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677693 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677710 2571 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:23.679386 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677713 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677716 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677719 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677722 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677725 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677727 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677730 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677733 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677736 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677738 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677741 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677744 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:23.679997 ip-10-0-138-42 
kubenswrapper[2571]: W0417 17:26:23.677746 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677749 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677752 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677754 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677757 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677759 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677761 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:23.679997 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677764 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677766 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677769 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677772 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677774 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677777 2571 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677780 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677782 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677785 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677788 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677790 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677793 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677796 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677799 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677802 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677804 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677807 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677810 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 
17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677812 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677815 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:23.680453 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677818 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677820 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677823 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677825 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677828 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677830 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677833 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677835 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677838 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677840 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677843 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:23.680978 ip-10-0-138-42 
kubenswrapper[2571]: W0417 17:26:23.677847 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677850 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677853 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677856 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677859 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:23.680978 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677862 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.677867 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677979 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677984 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677987 2571 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677990 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677993 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677995 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.677999 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678001 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678005 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678008 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678011 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678013 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678016 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:23.681366 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678018 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678022 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678027 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678032 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678035 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678039 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678042 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678045 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678048 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678051 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678053 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678056 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678059 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678061 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 
17:26:23.678064 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678066 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678069 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678071 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678074 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:23.681745 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678076 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678079 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678081 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678084 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678086 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678089 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678092 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678094 2571 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerification Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678097 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678100 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678103 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678105 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678108 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678111 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678113 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678116 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678118 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678121 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678123 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678126 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:23.682206 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678128 2571 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678131 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678134 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678136 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678139 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678141 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678144 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678146 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678149 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678151 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678153 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678156 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678158 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:23.682681 ip-10-0-138-42 
kubenswrapper[2571]: W0417 17:26:23.678161 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678163 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678166 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678168 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678171 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678174 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678176 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:23.682681 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678179 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678181 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678184 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678186 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678189 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678191 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 
17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678194 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678196 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678206 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678209 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678212 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678214 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678217 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:23.678220 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.678224 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:23.683188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.678896 2571 server.go:962] "Client rotation is on, will bootstrap in background" 
Apr 17 17:26:23.683554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.680923 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 17:26:23.683554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.682121 2571 server.go:1019] "Starting client certificate rotation" Apr 17 17:26:23.683554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.682222 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:26:23.683554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.683185 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:26:23.710457 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.710435 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:26:23.713870 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.713853 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:26:23.729894 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.729871 2571 log.go:25] "Validated CRI v1 runtime API" Apr 17 17:26:23.735780 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.735762 2571 log.go:25] "Validated CRI v1 image API" Apr 17 17:26:23.738396 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.738375 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 17:26:23.742489 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.742469 2571 fs.go:135] Filesystem UUIDs: map[07604135-264f-419e-b28e-b822e0920b65:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 91678cd4-ed60-4777-9503-6b34f6f76983:/dev/nvme0n1p3] Apr 17 17:26:23.742547 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.742490 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 
minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 17:26:23.748353 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.748249 2571 manager.go:217] Machine: {Timestamp:2026-04-17 17:26:23.746259398 +0000 UTC m=+0.420112724 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099616 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec201c815592b01cac6c0538ac8ec0e6 SystemUUID:ec201c81-5592-b01c-ac6c-0538ac8ec0e6 BootID:39fea56e-234a-4f77-a320-a17bbde41666 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:35:81:93:10:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:35:81:93:10:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:6c:1b:bc:b8:e1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 17:26:23.748353 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.748348 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 17:26:23.748462 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.748431 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:26:23.750414 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750388 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:26:23.750552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750417 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-42.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:26:23.750599 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750562 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:26:23.750599 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750571 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:26:23.750599 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750588 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:26:23.750676 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.750601 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:26:23.751482 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.751472 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:26:23.751584 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.751575 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:26:23.755057 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.755047 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:26:23.755099 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.755061 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:26:23.755099 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.755079 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:26:23.755099 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.755088 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:26:23.755099 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.755097 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:26:23.756129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.756118 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:26:23.756174 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.756136 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:26:23.759452 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.759430 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:26:23.760909 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.760891 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 17:26:23.761475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.761450 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:26:23.763047 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763035 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763056 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763068 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763076 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763082 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763087 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763093 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 17:26:23.763097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763099 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 17:26:23.763297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763107 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 17:26:23.763297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763114 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 17:26:23.763297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763122 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 17:26:23.763297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.763131 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 17:26:23.764982 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.764968 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 17:26:23.764982 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.764983 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 17:26:23.768620 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.768606 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 17:26:23.768694 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.768646 2571 server.go:1295] "Started kubelet"
Apr 17 17:26:23.768776 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.768728 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 17:26:23.768900 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.768838 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 17:26:23.768961 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.768943 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 17:26:23.769456 ip-10-0-138-42 systemd[1]: Started Kubernetes Kubelet.
Apr 17 17:26:23.769596 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.769450 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 17:26:23.769664 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.769600 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-42.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 17:26:23.769730 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.769676 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 17:26:23.770144 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.770056 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 17:26:23.771672 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.771658 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 17:26:23.776745 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.776728 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 17:26:23.776927 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.774774 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-42.ec2.internal.18a734f014e9ebc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-42.ec2.internal,UID:ip-10-0-138-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-42.ec2.internal,},FirstTimestamp:2026-04-17 17:26:23.768619969 +0000 UTC m=+0.442473296,LastTimestamp:2026-04-17 17:26:23.768619969 +0000 UTC m=+0.442473296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-42.ec2.internal,}"
Apr 17 17:26:23.777418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.777403 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 17:26:23.778592 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778573 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 17:26:23.778592 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778592 2571 factory.go:55] Registering systemd factory
Apr 17 17:26:23.778748 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778602 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 17 17:26:23.778748 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778650 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 17:26:23.778748 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778682 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 17:26:23.778748 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.778683 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:23.778748 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778733 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 17:26:23.778948 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778867 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 17:26:23.778948 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778878 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 17:26:23.778948 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778900 2571 factory.go:153] Registering CRI-O factory
Apr 17 17:26:23.778948 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778917 2571 factory.go:223] Registration of the crio container factory successfully
Apr 17 17:26:23.778948 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778943 2571 factory.go:103] Registering Raw factory
Apr 17 17:26:23.779126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.778957 2571 manager.go:1196] Started watching for new ooms in manager
Apr 17 17:26:23.779515 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.779499 2571 manager.go:319] Starting recovery of all containers
Apr 17 17:26:23.779979 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.779952 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 17:26:23.780111 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.779959 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 17:26:23.780194 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.780080 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 17:26:23.789020 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.788864 2571 manager.go:324] Recovery completed
Apr 17 17:26:23.793416 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.793343 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:23.796194 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796176 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:23.796268 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796211 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:23.796268 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796227 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:23.796824 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796808 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 17:26:23.796824 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796821 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 17:26:23.796930 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.796839 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:26:23.799046 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.799033 2571 policy_none.go:49] "None policy: Start"
Apr 17 17:26:23.799106 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.799050 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 17:26:23.799106 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.799060 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 17:26:23.807733 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.807642 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-42.ec2.internal.18a734f0168eaaf3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-42.ec2.internal,UID:ip-10-0-138-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-42.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-42.ec2.internal,},FirstTimestamp:2026-04-17 17:26:23.796194035 +0000 UTC m=+0.470047366,LastTimestamp:2026-04-17 17:26:23.796194035 +0000 UTC m=+0.470047366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-42.ec2.internal,}"
Apr 17 17:26:23.815959 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.815885 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-42.ec2.internal.18a734f0168f0a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-42.ec2.internal,UID:ip-10-0-138-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-138-42.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-138-42.ec2.internal,},FirstTimestamp:2026-04-17 17:26:23.796218453 +0000 UTC m=+0.470071788,LastTimestamp:2026-04-17 17:26:23.796218453 +0000 UTC m=+0.470071788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-42.ec2.internal,}"
Apr 17 17:26:23.824957 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.824874 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-42.ec2.internal.18a734f0168f4350 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-42.ec2.internal,UID:ip-10-0-138-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-138-42.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-138-42.ec2.internal,},FirstTimestamp:2026-04-17 17:26:23.79623304 +0000 UTC m=+0.470086369,LastTimestamp:2026-04-17 17:26:23.79623304 +0000 UTC m=+0.470086369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-42.ec2.internal,}"
Apr 17 17:26:23.839283 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839268 2571 manager.go:341] "Starting Device Plugin manager"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.839305 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839318 2571 server.go:85] "Starting device plugin registration server"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839561 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839575 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839735 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839810 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.839818 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.840284 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 17:26:23.848341 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.840324 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:23.849926 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.849868 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-42.ec2.internal.18a734f0194068e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-42.ec2.internal,UID:ip-10-0-138-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-138-42.ec2.internal,},FirstTimestamp:2026-04-17 17:26:23.841396963 +0000 UTC m=+0.515250280,LastTimestamp:2026-04-17 17:26:23.841396963 +0000 UTC m=+0.515250280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-42.ec2.internal,}"
Apr 17 17:26:23.862267 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.862249 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jvgzr"
Apr 17 17:26:23.870206 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.870185 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jvgzr"
Apr 17 17:26:23.910392 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.910349 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 17:26:23.911550 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.911531 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 17:26:23.911660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.911557 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 17:26:23.911660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.911580 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 17:26:23.911660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.911585 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 17:26:23.911660 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.911621 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 17:26:23.922271 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.922216 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:26:23.940313 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.940287 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:23.942090 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.942074 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:23.942169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.942101 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:23.942169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.942111 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:23.942169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.942136 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-42.ec2.internal"
Apr 17 17:26:23.951965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:23.951944 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-42.ec2.internal"
Apr 17 17:26:23.952034 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.951968 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-42.ec2.internal\": node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:23.970685 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:23.970660 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.012122 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.012068 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"]
Apr 17 17:26:24.012272 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.012176 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:24.013163 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.013147 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:24.013256 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.013181 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:24.013256 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.013196 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:24.014521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.014505 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:24.014656 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.014642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.014722 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.014672 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:24.015368 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015353 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:24.015419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015365 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:24.015419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015383 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:24.015419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015387 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:24.015419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015398 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:24.015419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.015405 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:24.016463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.016446 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.016554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.016469 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:26:24.017126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.017112 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:26:24.017203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.017133 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:26:24.017203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.017145 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:26:24.045278 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.045255 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-42.ec2.internal\" not found" node="ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.049501 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.049486 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-42.ec2.internal\" not found" node="ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.071774 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.071752 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.080475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.080458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12ee1495c5f896624a48870a38fd93a3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-42.ec2.internal\" (UID: \"12ee1495c5f896624a48870a38fd93a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.080534 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.080483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.080534 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.080500 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.172090 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.172058 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.181556 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.181556 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.181556 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12ee1495c5f896624a48870a38fd93a3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-42.ec2.internal\" (UID: \"12ee1495c5f896624a48870a38fd93a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.181687 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.181687 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb715ec7aeed1631a73df97923d1472-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal\" (UID: \"ddb715ec7aeed1631a73df97923d1472\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.181770 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.181716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12ee1495c5f896624a48870a38fd93a3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-42.ec2.internal\" (UID: \"12ee1495c5f896624a48870a38fd93a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.272940 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.272893 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.347485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.347446 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.352013 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.351996 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal"
Apr 17 17:26:24.373479 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.373451 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.474139 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.474068 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.574676 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.574648 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.619688 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.619663 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:26:24.675185 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.675156 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.682517 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.682496 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:26:24.682662 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.682645 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:26:24.682744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.682663 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:26:24.775357 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.775294 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found"
Apr 17 17:26:24.777462 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.777442 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:26:24.788386 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.788366 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:26:24.792188 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:24.792164 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb715ec7aeed1631a73df97923d1472.slice/crio-5dd6086e49aedf17d99109c200bb9826d5e89c969e5836cc7d312815ba260b47 WatchSource:0}: Error finding container 5dd6086e49aedf17d99109c200bb9826d5e89c969e5836cc7d312815ba260b47: Status 404 returned error can't find the container with id 5dd6086e49aedf17d99109c200bb9826d5e89c969e5836cc7d312815ba260b47
Apr 17 17:26:24.792657 ip-10-0-138-42 kubenswrapper[2571]: W0417
17:26:24.792647 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ee1495c5f896624a48870a38fd93a3.slice/crio-6e67f6ddafa57f6abed83d67a7a780fd7b99a615975501fd70f3814ee6f36ecf WatchSource:0}: Error finding container 6e67f6ddafa57f6abed83d67a7a780fd7b99a615975501fd70f3814ee6f36ecf: Status 404 returned error can't find the container with id 6e67f6ddafa57f6abed83d67a7a780fd7b99a615975501fd70f3814ee6f36ecf Apr 17 17:26:24.796468 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.796454 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:26:24.810332 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.810312 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7brlk" Apr 17 17:26:24.818265 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.818246 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7brlk" Apr 17 17:26:24.872890 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.872858 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:21:23 +0000 UTC" deadline="2027-10-13 16:29:18.743250492 +0000 UTC" Apr 17 17:26:24.872960 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.872890 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13055h2m53.870368213s" Apr 17 17:26:24.876021 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.876005 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found" Apr 17 17:26:24.914259 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.914206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal" event={"ID":"ddb715ec7aeed1631a73df97923d1472","Type":"ContainerStarted","Data":"5dd6086e49aedf17d99109c200bb9826d5e89c969e5836cc7d312815ba260b47"} Apr 17 17:26:24.915118 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:24.915092 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal" event={"ID":"12ee1495c5f896624a48870a38fd93a3","Type":"ContainerStarted","Data":"6e67f6ddafa57f6abed83d67a7a780fd7b99a615975501fd70f3814ee6f36ecf"} Apr 17 17:26:24.976304 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:24.976280 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found" Apr 17 17:26:25.076920 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.076853 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found" Apr 17 17:26:25.102733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.102693 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:25.177776 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.177742 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-42.ec2.internal\" not found" Apr 17 17:26:25.182720 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.182680 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:25.278721 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.278488 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal" Apr 17 17:26:25.290726 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.290457 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:26:25.290726 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.290569 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal" Apr 17 17:26:25.299237 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.299215 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:26:25.757171 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.757138 2571 apiserver.go:52] "Watching apiserver" Apr 17 17:26:25.764947 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.764922 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:26:25.765323 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.765298 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zdmch","openshift-multus/network-metrics-daemon-fczvt","openshift-network-diagnostics/network-check-target-7m8gw","openshift-ovn-kubernetes/ovnkube-node-nz6mf","kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal","openshift-cluster-node-tuning-operator/tuned-7qhwz","openshift-image-registry/node-ca-rfmdv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal","openshift-multus/multus-additional-cni-plugins-qdxz9","openshift-multus/multus-glbqg","openshift-network-operator/iptables-alerter-zpkb4","kube-system/konnectivity-agent-nd4xv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"] Apr 17 17:26:25.766683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.766667 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.767998 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.767973 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.769983 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.769584 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.769983 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.769826 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d6cjm\"" Apr 17 17:26:25.770458 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.770436 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:26:25.771186 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.771025 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.771630 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.771100 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:26:25.771787 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.771117 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.772893 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.772877 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:25.772987 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.772951 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:25.774207 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.774180 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:25.774295 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.774247 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:25.775533 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.775516 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.776126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776091 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.776521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776110 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:26:25.776521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776433 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.776686 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phpj9\"" Apr 17 17:26:25.776686 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776413 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:26:25.776686 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776573 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.776857 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776686 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7s5z8\"" Apr 17 17:26:25.776857 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776803 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:26:25.776857 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:26:25.777004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776857 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.777004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776897 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.777004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776913 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:26:25.777004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.776988 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:26:25.777172 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.777124 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:26:25.777463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.777444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5p7zn\"" Apr 17 17:26:25.778060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.778041 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.778153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.778137 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.779347 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.779330 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.780638 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.780618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:25.787924 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.787905 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.787924 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.787917 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.788067 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.787928 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:26:25.788067 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.787924 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:26:25.788169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788077 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-862sf\"" Apr 17 17:26:25.788169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788088 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.788169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788105 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.788169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788117 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.788403 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788221 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kltqm\"" Apr 17 17:26:25.788403 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788396 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xk74t\"" Apr 17 17:26:25.788557 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788542 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-np6vd\"" Apr 17 17:26:25.788646 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788627 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.788646 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788640 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:25.788778 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788739 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:26:25.788991 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788972 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ljwph\"" Apr 17 17:26:25.788991 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.788986 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:25.789112 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.789004 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:26:25.790252 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-hostroot\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.790355 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sb7l\" (UniqueName: \"kubernetes.io/projected/8f82208d-ba82-434a-a209-d69847b4e54b-kube-api-access-4sb7l\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:25.790445 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-tmp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.790507 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.790565 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-node-log\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.790614 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-kubernetes\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.790664 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-lib-modules\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.790728 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-host\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.790728 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-tuned\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.790852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790745 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cnibin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.790852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-socket-dir-parent\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.790852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-kubelet\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.790852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-config\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhhw\" (UniqueName: \"kubernetes.io/projected/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kube-api-access-zfhhw\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 
17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09957151-db22-4dfb-9042-61ea1fff6d0b-serviceca\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-d\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.790979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-ovn\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791006 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-konnectivity-ca\") pod \"konnectivity-agent-nd4xv\" (UID: \"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.791066 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-sys-fs\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: 
\"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kr4d\" (UniqueName: \"kubernetes.io/projected/09957151-db22-4dfb-9042-61ea1fff6d0b-kube-api-access-6kr4d\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-conf\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-hosts-file\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-etc-kubernetes\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-systemd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-k8s-cni-cncf-io\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791327 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp7mp\" (UniqueName: \"kubernetes.io/projected/a9fa17fb-8628-4f11-aa6a-a17be56e125d-kube-api-access-gp7mp\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-script-lib\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791389 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cni-binary-copy\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-var-lib-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791434 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-modprobe-d\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-kubelet\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-systemd-units\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-etc-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791583 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-system-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-conf-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-device-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.791654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98cp\" (UniqueName: 
\"kubernetes.io/projected/3c27294e-af6d-4679-a6cf-cd522ae4d31e-kube-api-access-n98cp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-multus-certs\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-agent-certs\") pod \"konnectivity-agent-nd4xv\" (UID: \"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792060 
ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-tmp-dir\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-netd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/05f5f413-d8bb-4636-a351-67da614740dd-ovn-node-metrics-cert\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09957151-db22-4dfb-9042-61ea1fff6d0b-host\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791936 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-os-release\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.791990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-slash\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-bin\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-os-release\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-multus\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-netns\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-env-overrides\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:25.792552 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:26:25.792269 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-var-lib-kubelet\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n574s\" (UniqueName: \"kubernetes.io/projected/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-kube-api-access-n574s\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69t2s\" (UniqueName: \"kubernetes.io/projected/05f5f413-d8bb-4636-a351-67da614740dd-kube-api-access-69t2s\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-registration-dir\") pod 
\"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-etc-selinux\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792462 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cnibin\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792533 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-daemon-config\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.792552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-netns\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-log-socket\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-socket-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysconfig\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792733 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-run\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-sys\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792802 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-systemd\") pod \"tuned-7qhwz\" (UID: 
\"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhxz\" (UniqueName: \"kubernetes.io/projected/5d2765e9-81b0-417d-9da1-edd5712b0fc4-kube-api-access-rfhxz\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.793184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.792926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-bin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.819194 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.819170 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:24 +0000 UTC" deadline="2027-11-30 10:44:23.198448112 +0000 UTC" Apr 17 17:26:25.819194 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.819194 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14201h17m57.379257888s" Apr 17 17:26:25.879846 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.879824 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:26:25.893940 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.893906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp7mp\" (UniqueName: \"kubernetes.io/projected/a9fa17fb-8628-4f11-aa6a-a17be56e125d-kube-api-access-gp7mp\") pod \"multus-glbqg\" (UID: 
\"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.893940 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.893940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-script-lib\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894134 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.893963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cni-binary-copy\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894134 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-var-lib-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894246 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-var-lib-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894246 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-modprobe-d\") pod \"tuned-7qhwz\" (UID: 
\"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.894246 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-kubelet\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-kubelet\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-systemd-units\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-etc-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-modprobe-d\") pod \"tuned-7qhwz\" (UID: 
\"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-system-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-systemd-units\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-etc-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-system-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-conf-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " 
pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-device-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-conf-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n98cp\" (UniqueName: \"kubernetes.io/projected/3c27294e-af6d-4679-a6cf-cd522ae4d31e-kube-api-access-n98cp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-device-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894529 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps86m\" (UniqueName: \"kubernetes.io/projected/94f7d770-f5f2-429d-8050-8a979402078e-kube-api-access-ps86m\") pod 
\"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-multus-certs\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-script-lib\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cni-binary-copy\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-multus-certs\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-agent-certs\") pod \"konnectivity-agent-nd4xv\" (UID: \"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.894660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-tmp-dir\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-netd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894811 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05f5f413-d8bb-4636-a351-67da614740dd-ovn-node-metrics-cert\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09957151-db22-4dfb-9042-61ea1fff6d0b-host\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-os-release\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-slash\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-bin\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-os-release\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09957151-db22-4dfb-9042-61ea1fff6d0b-host\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894920 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-multus\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.894980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-netns\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-env-overrides\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-netd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.895406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-tmp-dir\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895078 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-netns\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-os-release\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-multus\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-os-release\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-var-lib-kubelet\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-cni-bin\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-slash\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n574s\" (UniqueName: \"kubernetes.io/projected/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-kube-api-access-n574s\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-var-lib-kubelet\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69t2s\" (UniqueName: \"kubernetes.io/projected/05f5f413-d8bb-4636-a351-67da614740dd-kube-api-access-69t2s\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-registration-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-etc-selinux\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt"
Apr 17 17:26:25.896231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cnibin\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-registration-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94f7d770-f5f2-429d-8050-8a979402078e-host-slash\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-daemon-config\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-netns\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-log-socket\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-etc-selinux\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cnibin\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.895588 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-log-socket\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-socket-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysconfig\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-run-netns\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-env-overrides\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.897083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysconfig\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.895681 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:26.395641535 +0000 UTC m=+3.069494870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-run\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5d2765e9-81b0-417d-9da1-edd5712b0fc4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-sys\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-run\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895900 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-sys\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-systemd\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.895968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhxz\" (UniqueName: \"kubernetes.io/projected/5d2765e9-81b0-417d-9da1-edd5712b0fc4-kube-api-access-rfhxz\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/94f7d770-f5f2-429d-8050-8a979402078e-iptables-alerter-script\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-bin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-hostroot\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-daemon-config\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.897886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-socket-dir\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sb7l\" (UniqueName: \"kubernetes.io/projected/8f82208d-ba82-434a-a209-d69847b4e54b-kube-api-access-4sb7l\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-var-lib-cni-bin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-tmp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-cni-dir\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-hostroot\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-systemd\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896218 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-node-log\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-kubernetes\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-lib-modules\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-host\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-openvswitch\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-tuned\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cnibin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-kubernetes\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-socket-dir-parent\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-node-log\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.898582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-kubelet\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-cnibin\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-config\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-lib-modules\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-host\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-host-kubelet\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-multus-socket-dir-parent\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhhw\" (UniqueName: \"kubernetes.io/projected/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kube-api-access-zfhhw\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09957151-db22-4dfb-9042-61ea1fff6d0b-serviceca\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-d\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz"
Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName:
\"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-ovn\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-konnectivity-ca\") pod \"konnectivity-agent-nd4xv\" (UID: \"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-sys-fs\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kr4d\" (UniqueName: \"kubernetes.io/projected/09957151-db22-4dfb-9042-61ea1fff6d0b-kube-api-access-6kr4d\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-conf\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896802 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-hosts-file\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-etc-kubernetes\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.899257 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-systemd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-k8s-cni-cncf-io\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/09957151-db22-4dfb-9042-61ea1fff6d0b-serviceca\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-sys-fs\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05f5f413-d8bb-4636-a351-67da614740dd-ovnkube-config\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.896985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-d\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d2765e9-81b0-417d-9da1-edd5712b0fc4-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-host-run-k8s-cni-cncf-io\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-hosts-file\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897080 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9fa17fb-8628-4f11-aa6a-a17be56e125d-etc-kubernetes\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-systemd\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897136 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-sysctl-conf\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05f5f413-d8bb-4636-a351-67da614740dd-run-ovn\") 
pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.897222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-konnectivity-ca\") pod \"konnectivity-agent-nd4xv\" (UID: \"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.898784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05f5f413-d8bb-4636-a351-67da614740dd-ovn-node-metrics-cert\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.899376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-etc-tuned\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.899440 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c27294e-af6d-4679-a6cf-cd522ae4d31e-tmp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.899976 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.899912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a0b85ad2-2ebd-48c9-8952-e5d89d308e59-agent-certs\") pod \"konnectivity-agent-nd4xv\" (UID: 
\"a0b85ad2-2ebd-48c9-8952-e5d89d308e59\") " pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:25.901051 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.900979 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:25.901051 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.901002 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:25.901051 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.901015 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:25.901262 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:25.901076 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:26.401060434 +0000 UTC m=+3.074913752 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:25.903521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.903484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp7mp\" (UniqueName: \"kubernetes.io/projected/a9fa17fb-8628-4f11-aa6a-a17be56e125d-kube-api-access-gp7mp\") pod \"multus-glbqg\" (UID: \"a9fa17fb-8628-4f11-aa6a-a17be56e125d\") " pod="openshift-multus/multus-glbqg" Apr 17 17:26:25.903908 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.903860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98cp\" (UniqueName: \"kubernetes.io/projected/3c27294e-af6d-4679-a6cf-cd522ae4d31e-kube-api-access-n98cp\") pod \"tuned-7qhwz\" (UID: \"3c27294e-af6d-4679-a6cf-cd522ae4d31e\") " pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:25.904022 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.903968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n574s\" (UniqueName: \"kubernetes.io/projected/e5ac259a-b271-4b9e-9b77-a8a0b4118c19-kube-api-access-n574s\") pod \"node-resolver-zdmch\" (UID: \"e5ac259a-b271-4b9e-9b77-a8a0b4118c19\") " pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:25.904800 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.904777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69t2s\" (UniqueName: \"kubernetes.io/projected/05f5f413-d8bb-4636-a351-67da614740dd-kube-api-access-69t2s\") pod \"ovnkube-node-nz6mf\" (UID: \"05f5f413-d8bb-4636-a351-67da614740dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:25.905386 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.905363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhxz\" (UniqueName: \"kubernetes.io/projected/5d2765e9-81b0-417d-9da1-edd5712b0fc4-kube-api-access-rfhxz\") pod \"multus-additional-cni-plugins-qdxz9\" (UID: \"5d2765e9-81b0-417d-9da1-edd5712b0fc4\") " pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:25.905928 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.905900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhhw\" (UniqueName: \"kubernetes.io/projected/b8f4f4c2-154e-4ce8-9739-aa4b8553c79d-kube-api-access-zfhhw\") pod \"aws-ebs-csi-driver-node-jp7zx\" (UID: \"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:25.906569 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.906552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sb7l\" (UniqueName: \"kubernetes.io/projected/8f82208d-ba82-434a-a209-d69847b4e54b-kube-api-access-4sb7l\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:25.907111 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.907085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kr4d\" (UniqueName: \"kubernetes.io/projected/09957151-db22-4dfb-9042-61ea1fff6d0b-kube-api-access-6kr4d\") pod \"node-ca-rfmdv\" (UID: \"09957151-db22-4dfb-9042-61ea1fff6d0b\") " pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:25.997559 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.997528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps86m\" (UniqueName: 
\"kubernetes.io/projected/94f7d770-f5f2-429d-8050-8a979402078e-kube-api-access-ps86m\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:25.997745 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.997602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94f7d770-f5f2-429d-8050-8a979402078e-host-slash\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:25.997745 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.997632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/94f7d770-f5f2-429d-8050-8a979402078e-iptables-alerter-script\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:25.997745 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.997674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94f7d770-f5f2-429d-8050-8a979402078e-host-slash\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:25.998236 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:25.998215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/94f7d770-f5f2-429d-8050-8a979402078e-iptables-alerter-script\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:26.005904 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.005853 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ps86m\" (UniqueName: \"kubernetes.io/projected/94f7d770-f5f2-429d-8050-8a979402078e-kube-api-access-ps86m\") pod \"iptables-alerter-zpkb4\" (UID: \"94f7d770-f5f2-429d-8050-8a979402078e\") " pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:26.080488 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.080425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-glbqg" Apr 17 17:26:26.086168 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.086145 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" Apr 17 17:26:26.096681 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.096660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:26.101274 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.101257 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:26.108794 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.108777 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" Apr 17 17:26:26.115330 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.115314 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rfmdv" Apr 17 17:26:26.121782 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.121764 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" Apr 17 17:26:26.129259 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.129241 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zdmch" Apr 17 17:26:26.130651 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.130635 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zpkb4" Apr 17 17:26:26.136858 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.136839 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:26.400177 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.400114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:26.400293 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.400217 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:26.400293 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.400264 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:27.400251139 +0000 UTC m=+4.074104452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:26.421740 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.421710 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f5f413_d8bb_4636_a351_67da614740dd.slice/crio-f65e3374c47c089e119a243354b8969f51a24aba9c00868e8b6abb4f0af32fba WatchSource:0}: Error finding container f65e3374c47c089e119a243354b8969f51a24aba9c00868e8b6abb4f0af32fba: Status 404 returned error can't find the container with id f65e3374c47c089e119a243354b8969f51a24aba9c00868e8b6abb4f0af32fba Apr 17 17:26:26.423304 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.423201 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ac259a_b271_4b9e_9b77_a8a0b4118c19.slice/crio-ef73c161aca33759d182810da84b99abe15f7ad76c619b64d09d95ffa2630b6a WatchSource:0}: Error finding container ef73c161aca33759d182810da84b99abe15f7ad76c619b64d09d95ffa2630b6a: Status 404 returned error can't find the container with id ef73c161aca33759d182810da84b99abe15f7ad76c619b64d09d95ffa2630b6a Apr 17 17:26:26.426827 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.426803 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c27294e_af6d_4679_a6cf_cd522ae4d31e.slice/crio-7421480d7c56229c31b52db0e179aa3f446b748a5ff2aa716deda54be7da43ae WatchSource:0}: Error finding container 7421480d7c56229c31b52db0e179aa3f446b748a5ff2aa716deda54be7da43ae: Status 404 returned error can't find the container with id 7421480d7c56229c31b52db0e179aa3f446b748a5ff2aa716deda54be7da43ae Apr 17 17:26:26.427659 
ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.427639 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09957151_db22_4dfb_9042_61ea1fff6d0b.slice/crio-2e1e6a12559099ca2c56fb914ecf19f2c383a2010f8839e8348ab9a2bdb51757 WatchSource:0}: Error finding container 2e1e6a12559099ca2c56fb914ecf19f2c383a2010f8839e8348ab9a2bdb51757: Status 404 returned error can't find the container with id 2e1e6a12559099ca2c56fb914ecf19f2c383a2010f8839e8348ab9a2bdb51757 Apr 17 17:26:26.428344 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.428320 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f4f4c2_154e_4ce8_9739_aa4b8553c79d.slice/crio-b226c606091bc0fcc2857c2ac515df0a20a800296305ee32f72b940bc98977fd WatchSource:0}: Error finding container b226c606091bc0fcc2857c2ac515df0a20a800296305ee32f72b940bc98977fd: Status 404 returned error can't find the container with id b226c606091bc0fcc2857c2ac515df0a20a800296305ee32f72b940bc98977fd Apr 17 17:26:26.429056 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.429036 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f7d770_f5f2_429d_8050_8a979402078e.slice/crio-516f41c37c3125d0d07a923733d909232044d04429f8e67f689a31bceb463aee WatchSource:0}: Error finding container 516f41c37c3125d0d07a923733d909232044d04429f8e67f689a31bceb463aee: Status 404 returned error can't find the container with id 516f41c37c3125d0d07a923733d909232044d04429f8e67f689a31bceb463aee Apr 17 17:26:26.429915 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.429802 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fa17fb_8628_4f11_aa6a_a17be56e125d.slice/crio-3c9bd7341f9da8d3e4540068aa61c066b2f47131436edd4bf50df9ad512f6ffd WatchSource:0}: Error 
finding container 3c9bd7341f9da8d3e4540068aa61c066b2f47131436edd4bf50df9ad512f6ffd: Status 404 returned error can't find the container with id 3c9bd7341f9da8d3e4540068aa61c066b2f47131436edd4bf50df9ad512f6ffd Apr 17 17:26:26.430954 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.430886 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b85ad2_2ebd_48c9_8952_e5d89d308e59.slice/crio-1399a0368b2c4d63e6e37ba6f3e7adc1ffb85fa4d5784aebd14f8d2ab7671271 WatchSource:0}: Error finding container 1399a0368b2c4d63e6e37ba6f3e7adc1ffb85fa4d5784aebd14f8d2ab7671271: Status 404 returned error can't find the container with id 1399a0368b2c4d63e6e37ba6f3e7adc1ffb85fa4d5784aebd14f8d2ab7671271 Apr 17 17:26:26.433126 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:26.433104 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d2765e9_81b0_417d_9da1_edd5712b0fc4.slice/crio-739c4e1b69d46f84b7274e7e5ee94df49eb664f4534edea58c642aeb27d04846 WatchSource:0}: Error finding container 739c4e1b69d46f84b7274e7e5ee94df49eb664f4534edea58c642aeb27d04846: Status 404 returned error can't find the container with id 739c4e1b69d46f84b7274e7e5ee94df49eb664f4534edea58c642aeb27d04846 Apr 17 17:26:26.501376 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.501256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:26.501465 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.501400 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 17 17:26:26.501465 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.501417 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:26.501465 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.501426 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:26.501607 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:26.501480 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:27.501460289 +0000 UTC m=+4.175313619 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:26.821483 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.821370 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:24 +0000 UTC" deadline="2027-11-17 07:56:51.787413515 +0000 UTC" Apr 17 17:26:26.821483 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.821411 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13886h30m24.966006883s" Apr 17 17:26:26.928412 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.928339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerStarted","Data":"739c4e1b69d46f84b7274e7e5ee94df49eb664f4534edea58c642aeb27d04846"} Apr 17 17:26:26.934916 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.934886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nd4xv" event={"ID":"a0b85ad2-2ebd-48c9-8952-e5d89d308e59","Type":"ContainerStarted","Data":"1399a0368b2c4d63e6e37ba6f3e7adc1ffb85fa4d5784aebd14f8d2ab7671271"} Apr 17 17:26:26.938466 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.938434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rfmdv" event={"ID":"09957151-db22-4dfb-9042-61ea1fff6d0b","Type":"ContainerStarted","Data":"2e1e6a12559099ca2c56fb914ecf19f2c383a2010f8839e8348ab9a2bdb51757"} Apr 17 17:26:26.941177 ip-10-0-138-42 kubenswrapper[2571]: I0417 
17:26:26.941152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zdmch" event={"ID":"e5ac259a-b271-4b9e-9b77-a8a0b4118c19","Type":"ContainerStarted","Data":"ef73c161aca33759d182810da84b99abe15f7ad76c619b64d09d95ffa2630b6a"} Apr 17 17:26:26.945815 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.945126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal" event={"ID":"12ee1495c5f896624a48870a38fd93a3","Type":"ContainerStarted","Data":"8298aa38ce0f605421eb18ef35d7398340b9c5116187a269b297313e866a503f"} Apr 17 17:26:26.955485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.955460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-glbqg" event={"ID":"a9fa17fb-8628-4f11-aa6a-a17be56e125d","Type":"ContainerStarted","Data":"3c9bd7341f9da8d3e4540068aa61c066b2f47131436edd4bf50df9ad512f6ffd"} Apr 17 17:26:26.961174 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.960710 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-42.ec2.internal" podStartSLOduration=1.960681309 podStartE2EDuration="1.960681309s" podCreationTimestamp="2026-04-17 17:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:26.959922062 +0000 UTC m=+3.633775399" watchObservedRunningTime="2026-04-17 17:26:26.960681309 +0000 UTC m=+3.634534661" Apr 17 17:26:26.967398 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.967344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zpkb4" event={"ID":"94f7d770-f5f2-429d-8050-8a979402078e","Type":"ContainerStarted","Data":"516f41c37c3125d0d07a923733d909232044d04429f8e67f689a31bceb463aee"} Apr 17 17:26:26.974332 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.974285 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" event={"ID":"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d","Type":"ContainerStarted","Data":"b226c606091bc0fcc2857c2ac515df0a20a800296305ee32f72b940bc98977fd"} Apr 17 17:26:26.987321 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.983982 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" event={"ID":"3c27294e-af6d-4679-a6cf-cd522ae4d31e","Type":"ContainerStarted","Data":"7421480d7c56229c31b52db0e179aa3f446b748a5ff2aa716deda54be7da43ae"} Apr 17 17:26:26.987442 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:26.987336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"f65e3374c47c089e119a243354b8969f51a24aba9c00868e8b6abb4f0af32fba"} Apr 17 17:26:27.039350 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.039318 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:27.407204 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.407170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:27.407402 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.407290 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:27.407402 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.407362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs 
podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:29.407342028 +0000 UTC m=+6.081195359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:27.508521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.507786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:27.508521 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.508003 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:27.508521 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.508021 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:27.508521 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.508040 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:27.508521 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.508098 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:29.508080847 +0000 UTC m=+6.181934184 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:27.914996 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.914865 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:27.915437 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.915031 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:27.915437 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.915417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:27.915546 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:27.915499 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:27.992962 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.992930 2571 generic.go:358] "Generic (PLEG): container finished" podID="ddb715ec7aeed1631a73df97923d1472" containerID="e9fdfee26c19907a8830716e61c757fb8492c4d573999d6c26252dcc9b1decb2" exitCode=0 Apr 17 17:26:27.993133 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:27.993010 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal" event={"ID":"ddb715ec7aeed1631a73df97923d1472","Type":"ContainerDied","Data":"e9fdfee26c19907a8830716e61c757fb8492c4d573999d6c26252dcc9b1decb2"} Apr 17 17:26:28.997689 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:28.997648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal" event={"ID":"ddb715ec7aeed1631a73df97923d1472","Type":"ContainerStarted","Data":"ea33d335c2b6a7b4cf1e11a3edf2d1a78de0f18aae61ee99587f87fd73e9d474"} Apr 17 17:26:29.425903 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:29.425607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:29.425903 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.425789 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:29.425903 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.425857 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs 
podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:33.425838553 +0000 UTC m=+10.099691881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:29.526852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:29.526814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:29.527025 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.527006 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:29.527025 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.527025 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:29.527128 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.527036 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:29.527128 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.527092 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:33.527072925 +0000 UTC m=+10.200926258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:29.912728 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:29.912613 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:29.912970 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.912765 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:29.913109 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:29.912615 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:29.913239 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:29.913205 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:31.912597 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:31.911972 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:31.912597 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:31.912105 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:31.912597 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:31.911982 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:31.912597 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:31.912545 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:33.458210 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:33.458160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:33.458661 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.458349 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:33.458661 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.458427 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:41.458404983 +0000 UTC m=+18.132258320 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:33.558947 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:33.558875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:33.559127 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.559064 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:33.559127 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.559087 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:33.559127 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.559100 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:33.559291 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.559159 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:41.559141451 +0000 UTC m=+18.232994771 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:33.913283 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:33.912991 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:33.913283 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.913096 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:33.913283 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:33.913162 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:33.913283 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:33.913240 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:35.912528 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:35.912492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:35.912973 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:35.912545 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:35.912973 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:35.912638 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:35.912973 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:35.912797 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:36.770555 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:36.770507 2571 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64: reading manifest sha256:38b41ae697f031205813679347380d7f258be2a57902ad4494285782a241086b in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64" Apr 17 17:26:36.770824 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:36.770759 2571 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:konnectivity-agent,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64,Command:[/usr/bin/proxy-agent],Args:[--logtostderr=true --ca-cert /etc/konnectivity/ca/ca.crt --agent-cert /etc/konnectivity/agent/tls.crt --agent-key /etc/konnectivity/agent/tls.key --proxy-server-host konnectivity-server-clusters-e76ed338-6922-4902-af47--7f93713a.apps.kflux-prd-es01.1ion.p1.openshiftapps.com --proxy-server-port 443 --health-server-port 2041 --agent-identifiers=default-route=true --keepalive-time 30s --probe-interval 5s --sync-interval 5s --sync-interval-cap 30s --v 3],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:HTTP_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:HTTPS_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:NO_PROXY,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{40 -3} {} 
40m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:agent-certs,ReadOnly:false,MountPath:/etc/konnectivity/agent,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:konnectivity-ca,ReadOnly:false,MountPath:/etc/konnectivity/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 2041 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:readyz,Port:{0 2041 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:1,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 2041 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
konnectivity-agent-nd4xv_kube-system(a0b85ad2-2ebd-48c9-8952-e5d89d308e59): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64: reading manifest sha256:38b41ae697f031205813679347380d7f258be2a57902ad4494285782a241086b in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:26:36.771964 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:36.771924 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"konnectivity-agent\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64: reading manifest sha256:38b41ae697f031205813679347380d7f258be2a57902ad4494285782a241086b in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kube-system/konnectivity-agent-nd4xv" podUID="a0b85ad2-2ebd-48c9-8952-e5d89d308e59" Apr 17 17:26:37.012384 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:37.012339 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"konnectivity-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for 
docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64: reading manifest sha256:38b41ae697f031205813679347380d7f258be2a57902ad4494285782a241086b in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kube-system/konnectivity-agent-nd4xv" podUID="a0b85ad2-2ebd-48c9-8952-e5d89d308e59" Apr 17 17:26:37.029294 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:37.029247 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-42.ec2.internal" podStartSLOduration=12.029235397 podStartE2EDuration="12.029235397s" podCreationTimestamp="2026-04-17 17:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:29.025985808 +0000 UTC m=+5.699839146" watchObservedRunningTime="2026-04-17 17:26:37.029235397 +0000 UTC m=+13.703088732" Apr 17 17:26:37.912169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:37.912131 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:37.912169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:37.912163 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:37.912417 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:37.912276 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:37.912473 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:37.912409 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:39.912806 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:39.912764 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:39.913222 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:39.912818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:39.913222 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:39.912911 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:39.913222 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:39.913043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:41.516922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:41.516842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:41.517305 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.516939 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:41.517305 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.517013 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:57.516993537 +0000 UTC m=+34.190846852 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:41.617394 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:41.617363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:41.617582 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.617510 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:41.617582 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.617531 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:41.617582 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.617541 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kc7sn for pod openshift-network-diagnostics/network-check-target-7m8gw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:41.617740 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.617592 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn podName:dc54b8b2-8ef6-49b7-afa2-35c2acaae914 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:57.617574501 +0000 UTC m=+34.291427829 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc7sn" (UniqueName: "kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn") pod "network-check-target-7m8gw" (UID: "dc54b8b2-8ef6-49b7-afa2-35c2acaae914") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:41.912863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:41.912784 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:41.913017 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.912896 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:41.913017 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:41.912955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:41.913118 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:41.913036 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:43.022358 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.022112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerStarted","Data":"8e3abd5cbfe21fa6b9a41dd14bdfacc668449a1731dc9dca72333a964e9e833b"} Apr 17 17:26:43.023572 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.023539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rfmdv" event={"ID":"09957151-db22-4dfb-9042-61ea1fff6d0b","Type":"ContainerStarted","Data":"b1e6a771f5338175a1d566acb775ee7bd29891814c84985ba5d3d9325cbf84ee"} Apr 17 17:26:43.024839 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.024821 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zdmch" event={"ID":"e5ac259a-b271-4b9e-9b77-a8a0b4118c19","Type":"ContainerStarted","Data":"8877d109a8764398b808acf9b8d6a31c83b7ee9dfd69619898c04b84911c458e"} Apr 17 17:26:43.026039 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.026020 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-glbqg" event={"ID":"a9fa17fb-8628-4f11-aa6a-a17be56e125d","Type":"ContainerStarted","Data":"d9497f3cc341ff175235b009eb815ef2800cef43a7bda6137ee73ddf8ab4c2e1"} Apr 17 17:26:43.027255 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.027236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" event={"ID":"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d","Type":"ContainerStarted","Data":"aea09a7c3cd0e031922c0daf5d3254b0da1c6f0480f05eaa9fba25538528928a"} Apr 17 17:26:43.028389 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.028372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" 
event={"ID":"3c27294e-af6d-4679-a6cf-cd522ae4d31e","Type":"ContainerStarted","Data":"65cf245c1b3daa5aa81cfdc4738c45cb13e4f9bee87cfa2156d96a693b276588"} Apr 17 17:26:43.107043 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.106997 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7qhwz" podStartSLOduration=2.859892998 podStartE2EDuration="19.106983835s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.428467974 +0000 UTC m=+3.102321293" lastFinishedPulling="2026-04-17 17:26:42.675558802 +0000 UTC m=+19.349412130" observedRunningTime="2026-04-17 17:26:43.083791746 +0000 UTC m=+19.757645082" watchObservedRunningTime="2026-04-17 17:26:43.106983835 +0000 UTC m=+19.780837214" Apr 17 17:26:43.107196 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.107177 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-glbqg" podStartSLOduration=3.674720786 podStartE2EDuration="20.107172604s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.431980534 +0000 UTC m=+3.105833851" lastFinishedPulling="2026-04-17 17:26:42.86443234 +0000 UTC m=+19.538285669" observedRunningTime="2026-04-17 17:26:43.106455239 +0000 UTC m=+19.780308609" watchObservedRunningTime="2026-04-17 17:26:43.107172604 +0000 UTC m=+19.781025940" Apr 17 17:26:43.136212 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.136173 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rfmdv" podStartSLOduration=2.891323993 podStartE2EDuration="19.136162549s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.429271167 +0000 UTC m=+3.103124481" lastFinishedPulling="2026-04-17 17:26:42.674109723 +0000 UTC m=+19.347963037" observedRunningTime="2026-04-17 17:26:43.135357228 +0000 UTC m=+19.809210573" 
watchObservedRunningTime="2026-04-17 17:26:43.136162549 +0000 UTC m=+19.810015884" Apr 17 17:26:43.164456 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.164402 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zdmch" podStartSLOduration=2.9159452999999997 podStartE2EDuration="19.164383343s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.425239695 +0000 UTC m=+3.099093009" lastFinishedPulling="2026-04-17 17:26:42.673677725 +0000 UTC m=+19.347531052" observedRunningTime="2026-04-17 17:26:43.163913912 +0000 UTC m=+19.837767248" watchObservedRunningTime="2026-04-17 17:26:43.164383343 +0000 UTC m=+19.838236681" Apr 17 17:26:43.912680 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.912519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:43.912799 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:43.912583 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:43.912799 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:43.912759 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:43.912886 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:43.912856 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:44.031503 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.031436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zpkb4" event={"ID":"94f7d770-f5f2-429d-8050-8a979402078e","Type":"ContainerStarted","Data":"c70df7bc18e0b48d1f8e01c7eda984eb68b62ccc80e555b97c65bcb56e1b644a"} Apr 17 17:26:44.033772 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.033750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"adee7f996278b852e8913f347b1e61eb20826639e3680110ac4fb6855b93acc2"} Apr 17 17:26:44.033864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.033779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"21189e59bef95d147ecfa62e0151f9b26090f1dea5d854e6c388b6c241dac0cd"} Apr 17 17:26:44.033864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.033794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"de3ac8871f8e6516073bba9f1268ec475c0e0bacd28018ca2e3c1eeddff0ad2e"} Apr 17 17:26:44.033864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.033803 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"b6e8582b135bb3e9099ae25d31cc7e3fb296d8646884bc6073c7385baea53f98"} Apr 17 17:26:44.033864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.033811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"7791d685fc4dd199b6d48f3d9cd9a435fec758578abc563c8f142acb50edabf4"} Apr 17 17:26:44.034943 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.034921 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="8e3abd5cbfe21fa6b9a41dd14bdfacc668449a1731dc9dca72333a964e9e833b" exitCode=0 Apr 17 17:26:44.035083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.035044 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"8e3abd5cbfe21fa6b9a41dd14bdfacc668449a1731dc9dca72333a964e9e833b"} Apr 17 17:26:44.046973 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.046938 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zpkb4" podStartSLOduration=3.804492338 podStartE2EDuration="20.046927441s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.431260244 +0000 UTC m=+3.105113574" lastFinishedPulling="2026-04-17 17:26:42.673695359 +0000 UTC m=+19.347548677" observedRunningTime="2026-04-17 17:26:44.0466383 +0000 UTC m=+20.720491637" watchObservedRunningTime="2026-04-17 17:26:44.046927441 +0000 UTC m=+20.720780809" Apr 17 17:26:44.318641 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.318610 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:26:44.850872 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.850766 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:26:44.318635079Z","UUID":"68d7ef93-386f-4cd6-a603-6b71ccaa1835","Handler":null,"Name":"","Endpoint":""} Apr 17 17:26:44.854355 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.854334 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:26:44.854355 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:44.854360 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:26:45.039351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:45.039312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" event={"ID":"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d","Type":"ContainerStarted","Data":"9d908802623c0cfa8a44a79e24adf66123157c5435dd86a79e645ca282b1de7d"} Apr 17 17:26:45.042752 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:45.042726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"bcfcb4af2fcf4458e8d7296cd51bbe75bd27ec6eb7870c03013c1768eb66aa06"} Apr 17 17:26:45.912645 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:45.912613 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:45.912645 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:45.912648 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:45.912992 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:45.912760 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:45.912992 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:45.912906 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:46.046313 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:46.046279 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" event={"ID":"b8f4f4c2-154e-4ce8-9739-aa4b8553c79d","Type":"ContainerStarted","Data":"915f000253a22f5f532995b0932d16b735fa5bb903ea1b9c915522263af6f462"} Apr 17 17:26:46.065355 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:46.065298 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jp7zx" podStartSLOduration=4.254637323 podStartE2EDuration="23.065280986s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.43050183 +0000 UTC m=+3.104355146" lastFinishedPulling="2026-04-17 17:26:45.241145489 +0000 UTC m=+21.914998809" observedRunningTime="2026-04-17 17:26:46.064991692 +0000 UTC m=+22.738845030" 
watchObservedRunningTime="2026-04-17 17:26:46.065280986 +0000 UTC m=+22.739134325" Apr 17 17:26:47.051223 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:47.051011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"ae38808d47605e89e638927cbdb2ad7aef2d9f3776f00b1ed6bf451990aba35a"} Apr 17 17:26:47.912269 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:47.912238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:47.912446 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:47.912356 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:47.912446 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:47.912413 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:47.912562 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:47.912536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:49.056289 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.056257 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="e589fcde18854aee0a6ec76e57023fa4daefe1002a070c208c8fe30479656e37" exitCode=0 Apr 17 17:26:49.057152 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.056346 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"e589fcde18854aee0a6ec76e57023fa4daefe1002a070c208c8fe30479656e37"} Apr 17 17:26:49.059837 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.059655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" event={"ID":"05f5f413-d8bb-4636-a351-67da614740dd","Type":"ContainerStarted","Data":"28c1b301af61e7eaa1f1a13d50f41c4c884a78ca290c4daaf07b4a82f1b82e9e"} Apr 17 17:26:49.059966 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.059941 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:49.060062 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.059972 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:49.074000 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.073978 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:49.104176 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.104141 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" podStartSLOduration=9.231784593 podStartE2EDuration="26.104130447s" podCreationTimestamp="2026-04-17 17:26:23 +0000 
UTC" firstStartedPulling="2026-04-17 17:26:26.423859427 +0000 UTC m=+3.097712741" lastFinishedPulling="2026-04-17 17:26:43.296205281 +0000 UTC m=+19.970058595" observedRunningTime="2026-04-17 17:26:49.103661997 +0000 UTC m=+25.777515332" watchObservedRunningTime="2026-04-17 17:26:49.104130447 +0000 UTC m=+25.777983803" Apr 17 17:26:49.911923 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.911903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:49.912037 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:49.911975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:49.912112 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:49.912091 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:49.912252 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:49.912227 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:50.063643 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.063574 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="5ad17f535d2e75827867467850e8199516d41b03adb08988125ec3d14d0594d4" exitCode=0 Apr 17 17:26:50.064017 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.063656 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"5ad17f535d2e75827867467850e8199516d41b03adb08988125ec3d14d0594d4"} Apr 17 17:26:50.064833 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.064348 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:50.085759 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.085715 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:26:50.539974 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.539944 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fczvt"] Apr 17 17:26:50.540132 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.540097 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:50.540223 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:50.540198 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:50.551334 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.551311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7m8gw"] Apr 17 17:26:50.551463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:50.551401 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:50.551511 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:50.551468 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:51.068077 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:51.068042 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="0840d77fc26fa38a33fcab7ff478f2f610431d390997c9d1dd37813c3939ed10" exitCode=0 Apr 17 17:26:51.068492 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:51.068137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"0840d77fc26fa38a33fcab7ff478f2f610431d390997c9d1dd37813c3939ed10"} Apr 17 17:26:51.912129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:51.912095 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:51.912292 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:51.912258 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:52.912856 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:52.912565 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:52.913272 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:52.912906 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:53.074969 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:53.074936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nd4xv" event={"ID":"a0b85ad2-2ebd-48c9-8952-e5d89d308e59","Type":"ContainerStarted","Data":"4cfe2d5db0b31219fde45835a78f002fe665be66daa306eaec31ade141ac1808"} Apr 17 17:26:53.093140 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:53.093081 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nd4xv" podStartSLOduration=4.207516454 podStartE2EDuration="30.093061721s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.43297689 +0000 UTC m=+3.106830217" lastFinishedPulling="2026-04-17 17:26:52.318521975 +0000 UTC m=+28.992375484" observedRunningTime="2026-04-17 17:26:53.092120004 +0000 UTC m=+29.765973341" watchObservedRunningTime="2026-04-17 17:26:53.093061721 +0000 UTC m=+29.766915057" Apr 17 17:26:53.913905 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:53.913873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:53.914455 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:53.914001 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:54.912613 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:54.912582 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:54.912803 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:54.912722 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7m8gw" podUID="dc54b8b2-8ef6-49b7-afa2-35c2acaae914" Apr 17 17:26:55.309136 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:55.309102 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:55.310295 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:55.310272 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:55.912372 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:55.912336 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:55.912521 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:55.912476 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:26:56.081164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.081121 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:56.081803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.081766 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nd4xv" Apr 17 17:26:56.627566 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.627534 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-42.ec2.internal" event="NodeReady" Apr 17 17:26:56.628215 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.627674 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:26:56.674144 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.674065 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rr5tw"] Apr 17 17:26:56.678474 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.678451 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zflhg"] Apr 17 17:26:56.678640 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.678613 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.681234 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.681214 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:26:56.681365 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.681251 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\"" Apr 17 17:26:56.681365 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.681317 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:26:56.682038 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.681749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:56.683155 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.683121 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zflhg"] Apr 17 17:26:56.684203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.684183 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:26:56.684304 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.684249 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:26:56.685746 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.685725 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:26:56.685947 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.685929 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\"" Apr 17 17:26:56.687269 ip-10-0-138-42 kubenswrapper[2571]: I0417 
17:26:56.687249 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rr5tw"] Apr 17 17:26:56.827763 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f010296-632c-435e-b1db-62d8eeeae050-tmp-dir\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.827929 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx62\" (UniqueName: \"kubernetes.io/projected/d55bc00d-c046-4764-9cfb-801efb7b23b8-kube-api-access-chx62\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:56.827929 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:56.827929 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827896 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9kp6\" (UniqueName: \"kubernetes.io/projected/5f010296-632c-435e-b1db-62d8eeeae050-kube-api-access-x9kp6\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.828058 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5f010296-632c-435e-b1db-62d8eeeae050-config-volume\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.828058 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.827965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.912231 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.912194 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:56.915121 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.915091 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:26:56.915255 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.915140 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:26:56.915255 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.915150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-r46n5\"" Apr 17 17:26:56.928607 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9kp6\" (UniqueName: \"kubernetes.io/projected/5f010296-632c-435e-b1db-62d8eeeae050-kube-api-access-x9kp6\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.928732 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928605 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f010296-632c-435e-b1db-62d8eeeae050-config-volume\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.928800 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.928854 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f010296-632c-435e-b1db-62d8eeeae050-tmp-dir\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.928854 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chx62\" (UniqueName: \"kubernetes.io/projected/d55bc00d-c046-4764-9cfb-801efb7b23b8-kube-api-access-chx62\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:56.928958 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.928855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:56.928958 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:56.928917 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 17 17:26:56.929050 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:56.928984 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:56.929050 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:56.929001 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:57.428979096 +0000 UTC m=+34.102832413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:26:56.929050 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:56.929039 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:57.429022681 +0000 UTC m=+34.102876002 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:26:56.929158 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.929066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f010296-632c-435e-b1db-62d8eeeae050-tmp-dir\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.929613 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.929595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f010296-632c-435e-b1db-62d8eeeae050-config-volume\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.940424 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.940401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9kp6\" (UniqueName: \"kubernetes.io/projected/5f010296-632c-435e-b1db-62d8eeeae050-kube-api-access-x9kp6\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:56.940557 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:56.940532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx62\" (UniqueName: \"kubernetes.io/projected/d55bc00d-c046-4764-9cfb-801efb7b23b8-kube-api-access-chx62\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:57.433350 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.433309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:57.433525 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.433360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:57.433525 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.433459 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:57.433595 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.433459 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:57.433595 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.433558 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:58.43353733 +0000 UTC m=+35.107390643 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:26:57.433663 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.433595 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:58.433584201 +0000 UTC m=+35.107437520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:26:57.534608 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.534572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:57.534795 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.534746 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:57.534835 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:57.534810 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:27:29.534793254 +0000 UTC m=+66.208646576 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:57.635119 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.635082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:57.638844 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.638798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc7sn\" (UniqueName: \"kubernetes.io/projected/dc54b8b2-8ef6-49b7-afa2-35c2acaae914-kube-api-access-kc7sn\") pod \"network-check-target-7m8gw\" (UID: \"dc54b8b2-8ef6-49b7-afa2-35c2acaae914\") " pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:57.822304 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.822263 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:26:57.912129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.911966 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:26:57.916127 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.915967 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\"" Apr 17 17:26:57.916127 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.916004 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:26:57.974978 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:57.974861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7m8gw"] Apr 17 17:26:57.980788 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:26:57.980752 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc54b8b2_8ef6_49b7_afa2_35c2acaae914.slice/crio-1e7f3f11eb1c8008cf671307bb4ad6d3250a10d5f82914d9710c6217384448e7 WatchSource:0}: Error finding container 1e7f3f11eb1c8008cf671307bb4ad6d3250a10d5f82914d9710c6217384448e7: Status 404 returned error can't find the container with id 1e7f3f11eb1c8008cf671307bb4ad6d3250a10d5f82914d9710c6217384448e7 Apr 17 17:26:58.087403 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:58.087366 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="6ba6e5cba016a4fe61de2bf844611633085ed40ab126ddd4f3e7f6e533a09f0b" exitCode=0 Apr 17 17:26:58.087550 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:58.087451 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"6ba6e5cba016a4fe61de2bf844611633085ed40ab126ddd4f3e7f6e533a09f0b"} Apr 17 17:26:58.088477 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:58.088454 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-7m8gw" event={"ID":"dc54b8b2-8ef6-49b7-afa2-35c2acaae914","Type":"ContainerStarted","Data":"1e7f3f11eb1c8008cf671307bb4ad6d3250a10d5f82914d9710c6217384448e7"} Apr 17 17:26:58.442864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:58.442830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:26:58.443051 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:58.443030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:26:58.443119 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:58.443034 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:58.443165 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:58.443118 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:00.443097649 +0000 UTC m=+37.116950974 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:26:58.443165 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:58.443138 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:58.443268 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:26:58.443212 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:00.443194896 +0000 UTC m=+37.117048213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:26:59.093772 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:59.093677 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d2765e9-81b0-417d-9da1-edd5712b0fc4" containerID="5e763723b21601eb11cf20b814b8728352a54341a1fff8a36b39ff3695ba1c5e" exitCode=0 Apr 17 17:26:59.094255 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:26:59.093805 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerDied","Data":"5e763723b21601eb11cf20b814b8728352a54341a1fff8a36b39ff3695ba1c5e"} Apr 17 17:27:00.100061 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:00.099792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" 
event={"ID":"5d2765e9-81b0-417d-9da1-edd5712b0fc4","Type":"ContainerStarted","Data":"1117d9c0d342ccadedf417f3c974aad5c813d174a7b3ee7ecea397136f8fd07e"} Apr 17 17:27:00.126660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:00.126598 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qdxz9" podStartSLOduration=4.997241956 podStartE2EDuration="36.126581229s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:26.434751794 +0000 UTC m=+3.108605108" lastFinishedPulling="2026-04-17 17:26:57.564091064 +0000 UTC m=+34.237944381" observedRunningTime="2026-04-17 17:27:00.125266031 +0000 UTC m=+36.799119379" watchObservedRunningTime="2026-04-17 17:27:00.126581229 +0000 UTC m=+36.800434566" Apr 17 17:27:00.460474 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:00.460438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:27:00.460678 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:00.460499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:27:00.460678 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:00.460610 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:00.460820 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:00.460682 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls 
podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:04.460665524 +0000 UTC m=+41.134518859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:27:00.460820 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:00.460621 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:00.460820 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:00.460782 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:04.460749368 +0000 UTC m=+41.134602682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:27:02.104428 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:02.104387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7m8gw" event={"ID":"dc54b8b2-8ef6-49b7-afa2-35c2acaae914","Type":"ContainerStarted","Data":"8ed3651d00febbdb64fb33c281718b0c7d9f913157a2c418d2e3037cf4c6ec5e"} Apr 17 17:27:02.104993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:02.104604 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:27:02.123173 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:02.123127 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7m8gw" podStartSLOduration=34.999608819 podStartE2EDuration="38.123114564s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:57.982711497 +0000 UTC m=+34.656564823" lastFinishedPulling="2026-04-17 17:27:01.106217253 +0000 UTC m=+37.780070568" observedRunningTime="2026-04-17 17:27:02.122763692 +0000 UTC m=+38.796617029" watchObservedRunningTime="2026-04-17 17:27:02.123114564 +0000 UTC m=+38.796967899" Apr 17 17:27:04.487944 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.487907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:27:04.488365 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.487953 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:27:04.488365 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:04.488055 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:04.488365 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:04.488078 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:04.488365 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:04.488125 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:12.488107751 +0000 UTC m=+49.161961064 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:27:04.488365 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:04.488140 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:12.488133901 +0000 UTC m=+49.161987216 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:27:04.839507 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.839418 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg"] Apr 17 17:27:04.842218 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.842197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.845203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.845180 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:27:04.845312 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.845214 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 17:27:04.846170 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.846145 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:27:04.846281 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.846228 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 17:27:04.846281 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.846236 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 17:27:04.846385 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:27:04.846322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 17:27:04.846441 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.846427 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:27:04.853667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.853643 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg"] Apr 17 17:27:04.991241 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.991433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/004f4c3e-b34a-4576-8348-b548fc8d5729-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.991433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.991433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.991433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991395 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:04.991433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:04.991417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rmsm\" (UniqueName: \"kubernetes.io/projected/004f4c3e-b34a-4576-8348-b548fc8d5729-kube-api-access-8rmsm\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.092605 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 
17:27:05.092744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/004f4c3e-b34a-4576-8348-b548fc8d5729-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.092744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.092744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.092744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.092744 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.092723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rmsm\" (UniqueName: 
\"kubernetes.io/projected/004f4c3e-b34a-4576-8348-b548fc8d5729-kube-api-access-8rmsm\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.093809 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.093782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/004f4c3e-b34a-4576-8348-b548fc8d5729-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.098487 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.098461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-ca\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.098683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.098666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.099163 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.099145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.100956 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.100939 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rmsm\" (UniqueName: \"kubernetes.io/projected/004f4c3e-b34a-4576-8348-b548fc8d5729-kube-api-access-8rmsm\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.104070 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.104044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/004f4c3e-b34a-4576-8348-b548fc8d5729-hub\") pod \"cluster-proxy-proxy-agent-7d859f5596-hkptg\" (UID: \"004f4c3e-b34a-4576-8348-b548fc8d5729\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.160084 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.160047 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:27:05.276020 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:05.275987 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg"] Apr 17 17:27:05.279081 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:27:05.279055 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod004f4c3e_b34a_4576_8348_b548fc8d5729.slice/crio-995f7e72848c181e56664774d826cc0630e13437f2fab088b963d413a629812a WatchSource:0}: Error finding container 995f7e72848c181e56664774d826cc0630e13437f2fab088b963d413a629812a: Status 404 returned error can't find the container with id 995f7e72848c181e56664774d826cc0630e13437f2fab088b963d413a629812a Apr 17 17:27:06.113784 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:06.113741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerStarted","Data":"995f7e72848c181e56664774d826cc0630e13437f2fab088b963d413a629812a"} Apr 17 17:27:09.120829 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:09.120795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerStarted","Data":"407eda175e7a8bd41ac0089b762f49a8a516282296fc2d841a14f91c2749e9b9"} Apr 17 17:27:11.126073 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:11.126048 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" 
event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerStarted","Data":"73eb926c98ba96b415745fbd2edc75395ff02fb465e97f027936de638e61aa1a"} Apr 17 17:27:12.129345 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:12.129298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerStarted","Data":"c67ef2ee1b69286fcd3690cadb6c25d9b433f572e7d047f00788d1968c4899fc"} Apr 17 17:27:12.151330 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:12.151285 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" podStartSLOduration=2.457729477 podStartE2EDuration="8.151273664s" podCreationTimestamp="2026-04-17 17:27:04 +0000 UTC" firstStartedPulling="2026-04-17 17:27:05.280833643 +0000 UTC m=+41.954686958" lastFinishedPulling="2026-04-17 17:27:10.974377826 +0000 UTC m=+47.648231145" observedRunningTime="2026-04-17 17:27:12.149907583 +0000 UTC m=+48.823760919" watchObservedRunningTime="2026-04-17 17:27:12.151273664 +0000 UTC m=+48.825127001" Apr 17 17:27:12.546192 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:12.546152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:27:12.546349 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:12.546203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:27:12.546349 
ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:12.546297 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:12.546349 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:12.546311 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:12.546453 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:12.546366 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:28.54634874 +0000 UTC m=+65.220202055 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:27:12.546453 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:12.546381 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:28.54637473 +0000 UTC m=+65.220228045 (durationBeforeRetry 16s). 
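The `durationBeforeRetry` values in these `nestedpendingoperations` entries follow the kubelet's per-volume exponential backoff: each failed MountVolume.SetUp attempt roughly doubles the wait before the next retry, up to a cap. A minimal sketch of that schedule, assuming the commonly cited parameters (500ms initial delay, factor 2, cap of 2m2s — these are assumptions, not read from this log):

```python
from datetime import timedelta

def backoff_schedule(initial=timedelta(milliseconds=500),
                     factor=2,
                     cap=timedelta(minutes=2, seconds=2),
                     attempts=10):
    """Yield successive durationBeforeRetry values: double each time, capped."""
    delay = initial
    for _ in range(attempts):
        yield min(delay, cap)
        delay = min(delay * factor, cap)

# The tail of this schedule matches the waits seen in the log:
# 16s, 32s, 1m4s, then the 2m2s cap.
schedule = list(backoff_schedule())
```

This is why the same "secret not found" error reappears at 17:27:12 (16s), 17:27:28 (32s), 17:28:00 (1m4s), and — for `metrics-certs` — with a final 2m2s wait: the secrets have not been created yet, and the kubelet simply keeps retrying on this schedule until they appear.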
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:27:22.080462 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:22.080436 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz6mf" Apr 17 17:27:28.548753 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:28.548689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:27:28.548753 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:28.548763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:27:28.549182 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:28.548842 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:28.549182 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:28.548863 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:28.549182 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:28.548915 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:28:00.548899555 +0000 UTC m=+97.222752869 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:27:28.549182 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:28.548931 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:00.548924977 +0000 UTC m=+97.222778290 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:27:29.554773 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:29.554717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:27:29.557767 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:29.557747 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:27:29.565039 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:29.565021 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:27:29.565122 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:27:29.565086 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:28:33.565066073 +0000 UTC m=+130.238919407 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : secret "metrics-daemon-secret" not found Apr 17 17:27:33.109510 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:27:33.109478 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7m8gw" Apr 17 17:28:00.564690 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:00.564643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:28:00.564690 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:00.564715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:28:00.565140 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:00.564796 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:28:00.565140 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:00.564799 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:28:00.565140 ip-10-0-138-42 kubenswrapper[2571]: E0417 
17:28:00.564870 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert podName:d55bc00d-c046-4764-9cfb-801efb7b23b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:04.564854864 +0000 UTC m=+161.238708191 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert") pod "ingress-canary-zflhg" (UID: "d55bc00d-c046-4764-9cfb-801efb7b23b8") : secret "canary-serving-cert" not found Apr 17 17:28:00.565140 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:00.564882 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls podName:5f010296-632c-435e-b1db-62d8eeeae050 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:04.564876693 +0000 UTC m=+161.238730008 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls") pod "dns-default-rr5tw" (UID: "5f010296-632c-435e-b1db-62d8eeeae050") : secret "dns-default-metrics-tls" not found Apr 17 17:28:27.931168 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.931137 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j"] Apr 17 17:28:27.933882 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.933867 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:27.936594 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.936578 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 17:28:27.937230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.937210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.937345 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.937231 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 17:28:27.937596 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.937579 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.937947 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.937934 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-gplbb\"" Apr 17 17:28:27.949682 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:27.949659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j"] Apr 17 17:28:28.048569 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.048539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba9623b-16db-45c3-b260-9df660f50aa1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.048759 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.048577 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba9623b-16db-45c3-b260-9df660f50aa1-config\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.048759 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.048599 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5dg\" (UniqueName: \"kubernetes.io/projected/0ba9623b-16db-45c3-b260-9df660f50aa1-kube-api-access-nk5dg\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.149814 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.149784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5dg\" (UniqueName: \"kubernetes.io/projected/0ba9623b-16db-45c3-b260-9df660f50aa1-kube-api-access-nk5dg\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.149925 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.149859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba9623b-16db-45c3-b260-9df660f50aa1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.149925 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.149885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba9623b-16db-45c3-b260-9df660f50aa1-config\") pod \"service-ca-operator-d6fc45fc5-sw58j\" 
(UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.150906 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.150887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba9623b-16db-45c3-b260-9df660f50aa1-config\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.152139 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.152118 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba9623b-16db-45c3-b260-9df660f50aa1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.158466 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.158443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5dg\" (UniqueName: \"kubernetes.io/projected/0ba9623b-16db-45c3-b260-9df660f50aa1-kube-api-access-nk5dg\") pod \"service-ca-operator-d6fc45fc5-sw58j\" (UID: \"0ba9623b-16db-45c3-b260-9df660f50aa1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.241725 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.241688 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" Apr 17 17:28:28.352464 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:28.352438 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j"] Apr 17 17:28:28.355219 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:28:28.355194 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba9623b_16db_45c3_b260_9df660f50aa1.slice/crio-955228f2422cd50fee39e4fe277542ad7d941c1ceab591a337cd1d166d57cbff WatchSource:0}: Error finding container 955228f2422cd50fee39e4fe277542ad7d941c1ceab591a337cd1d166d57cbff: Status 404 returned error can't find the container with id 955228f2422cd50fee39e4fe277542ad7d941c1ceab591a337cd1d166d57cbff Apr 17 17:28:29.277916 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:29.277880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" event={"ID":"0ba9623b-16db-45c3-b260-9df660f50aa1","Type":"ContainerStarted","Data":"955228f2422cd50fee39e4fe277542ad7d941c1ceab591a337cd1d166d57cbff"} Apr 17 17:28:30.281418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:30.281384 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" event={"ID":"0ba9623b-16db-45c3-b260-9df660f50aa1","Type":"ContainerStarted","Data":"e4e040159cb7d4f2e91f3ec671ed5300ec98fb73c60c39a6e726e40789994f7b"} Apr 17 17:28:30.299783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:30.299735 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" podStartSLOduration=1.487002691 podStartE2EDuration="3.299720505s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:28.356835244 +0000 UTC 
m=+125.030688557" lastFinishedPulling="2026-04-17 17:28:30.169553054 +0000 UTC m=+126.843406371" observedRunningTime="2026-04-17 17:28:30.298786734 +0000 UTC m=+126.972640070" watchObservedRunningTime="2026-04-17 17:28:30.299720505 +0000 UTC m=+126.973573842" Apr 17 17:28:33.593650 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:33.593607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:28:33.594167 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:33.593800 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:28:33.594167 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:33.593885 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs podName:8f82208d-ba82-434a-a209-d69847b4e54b nodeName:}" failed. No retries permitted until 2026-04-17 17:30:35.593863243 +0000 UTC m=+252.267716557 (durationBeforeRetry 2m2s). 
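The `pod_startup_latency_tracker` entries above can be cross-checked from their own timestamps: `podStartE2EDuration` is `watchObservedRunningTime` minus `podCreationTimestamp`, and `podStartSLOduration` appears to be that figure minus the image-pull window (`lastFinishedPulling` minus `firstStartedPulling`). A rough check against the cluster-proxy-proxy-agent entry, truncating the log's nanosecond timestamps to microseconds for Python's `%f` (sub-millisecond drift is expected, since the kubelet mixes wall-clock and monotonic readings):

```python
from datetime import datetime, timezone

def parse(ts):
    # Fractional timestamps from the log, truncated to 26 chars so that
    # the fractional part fits %f (microseconds).
    return datetime.strptime(ts[:26], "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

# podCreationTimestamp has no fractional part in the log.
creation   = datetime(2026, 4, 17, 17, 27, 4, tzinfo=timezone.utc)
running    = parse("2026-04-17 17:27:12.151273664")  # watchObservedRunningTime
pull_start = parse("2026-04-17 17:27:05.280833643")  # firstStartedPulling
pull_end   = parse("2026-04-17 17:27:10.974377826")  # lastFinishedPulling

e2e = (running - creation).total_seconds()           # ~8.151s (podStartE2EDuration)
slo = e2e - (pull_end - pull_start).total_seconds()  # ~2.458s (podStartSLOduration)
```

Both reconstructed values agree with the logged `podStartE2EDuration=8.151273664s` and `podStartSLOduration=2.457729477` to within a millisecond; the SLO figure excludes pull time because image download speed is outside the startup SLO.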
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs") pod "network-metrics-daemon-fczvt" (UID: "8f82208d-ba82-434a-a209-d69847b4e54b") : secret "metrics-daemon-secret" not found Apr 17 17:28:35.463429 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:35.463399 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zdmch_e5ac259a-b271-4b9e-9b77-a8a0b4118c19/dns-node-resolver/0.log" Apr 17 17:28:36.062655 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:36.062631 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rfmdv_09957151-db22-4dfb-9042-61ea1fff6d0b/node-ca/0.log" Apr 17 17:28:45.161292 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:45.161236 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" podUID="004f4c3e-b34a-4576-8348-b548fc8d5729" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:28:54.041168 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.041131 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv"] Apr 17 17:28:54.044111 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.044091 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.048214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.048190 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 17:28:54.048319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.048192 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 17:28:54.048319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.048201 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5nbsv\"" Apr 17 17:28:54.053779 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.053759 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv"] Apr 17 17:28:54.075734 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.075691 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2kgc2"] Apr 17 17:28:54.078476 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.078462 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.080992 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.080975 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:28:54.081194 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.081181 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:28:54.081278 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.081200 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:28:54.081278 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.081210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:28:54.081278 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.081211 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jslpj\"" Apr 17 17:28:54.089691 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.089672 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2kgc2"] Apr 17 17:28:54.140421 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.140397 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe1425c3-7bb7-4eea-8893-f2088028e46e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.140551 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.140506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe1425c3-7bb7-4eea-8893-f2088028e46e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.164601 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.164576 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-688d48f848-x5zg2"] Apr 17 17:28:54.167277 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.167261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.172270 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.172251 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:28:54.172391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.172331 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:28:54.172519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.172500 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pcbgd\"" Apr 17 17:28:54.172559 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.172500 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:28:54.186117 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.186098 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:28:54.201171 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.201150 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-688d48f848-x5zg2"] Apr 17 17:28:54.238600 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.238575 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-688d48f848-x5zg2"] Apr 17 17:28:54.238760 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:54.238740 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-29gcc registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-29gcc registry-certificates registry-tls trusted-ca]: context canceled" pod="openshift-image-registry/image-registry-688d48f848-x5zg2" podUID="2494c2f9-377b-4e6f-97e1-a9e4159892ce" Apr 17 17:28:54.240877 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.240861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe1425c3-7bb7-4eea-8893-f2088028e46e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.240922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.240894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8b4\" (UniqueName: \"kubernetes.io/projected/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-api-access-jv8b4\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.240922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.240914 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/707dd6a1-ae31-44f2-b222-f3bb7a67e699-crio-socket\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.240983 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.240935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.241082 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.241063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe1425c3-7bb7-4eea-8893-f2088028e46e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.241129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.241118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/707dd6a1-ae31-44f2-b222-f3bb7a67e699-data-volume\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.241202 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.241184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/707dd6a1-ae31-44f2-b222-f3bb7a67e699-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.241639 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.241616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe1425c3-7bb7-4eea-8893-f2088028e46e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.243850 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.243834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe1425c3-7bb7-4eea-8893-f2088028e46e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tp7vv\" (UID: \"fe1425c3-7bb7-4eea-8893-f2088028e46e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.326440 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.326378 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.330384 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.330365 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342122 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv8b4\" (UniqueName: \"kubernetes.io/projected/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-api-access-jv8b4\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gcc\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342356 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/707dd6a1-ae31-44f2-b222-f3bb7a67e699-data-volume\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342508 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/707dd6a1-ae31-44f2-b222-f3bb7a67e699-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342508 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342584 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/707dd6a1-ae31-44f2-b222-f3bb7a67e699-crio-socket\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342584 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/707dd6a1-ae31-44f2-b222-f3bb7a67e699-crio-socket\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.342757 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:28:54.342662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/707dd6a1-ae31-44f2-b222-f3bb7a67e699-data-volume\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.342757 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.342673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.343060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.343030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.344650 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.344629 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/707dd6a1-ae31-44f2-b222-f3bb7a67e699-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2kgc2\" (UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.350405 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.350385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv8b4\" (UniqueName: \"kubernetes.io/projected/707dd6a1-ae31-44f2-b222-f3bb7a67e699-kube-api-access-jv8b4\") pod \"insights-runtime-extractor-2kgc2\" 
(UID: \"707dd6a1-ae31-44f2-b222-f3bb7a67e699\") " pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.352313 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.352296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" Apr 17 17:28:54.386539 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.386415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2kgc2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443722 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.443856 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29gcc\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc\") pod \"image-registry-688d48f848-x5zg2\" (UID: 
\"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.444843 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.444361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.445376 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.445087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.445376 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.445159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.448247 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.448201 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.448917 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.448874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.449446 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.449427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.455480 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.455445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gcc\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.456857 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.456838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token\") pod \"image-registry-688d48f848-x5zg2\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:54.481739 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.481688 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv"] Apr 17 17:28:54.485479 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:28:54.485432 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe1425c3_7bb7_4eea_8893_f2088028e46e.slice/crio-343302997a7a063c1043f7d842eb0c398195c97fdceb3cc5f32b2d6c96db3e68 WatchSource:0}: Error finding container 343302997a7a063c1043f7d842eb0c398195c97fdceb3cc5f32b2d6c96db3e68: Status 404 returned error can't find the container with id 343302997a7a063c1043f7d842eb0c398195c97fdceb3cc5f32b2d6c96db3e68 Apr 17 17:28:54.520500 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.520475 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2kgc2"] Apr 17 17:28:54.522836 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:28:54.522815 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707dd6a1_ae31_44f2_b222_f3bb7a67e699.slice/crio-437a3f543084a14be01d189e41f614690d8fc63b701975cab5b0adce798e736a WatchSource:0}: Error finding container 437a3f543084a14be01d189e41f614690d8fc63b701975cab5b0adce798e736a: Status 404 returned error can't find the container with id 437a3f543084a14be01d189e41f614690d8fc63b701975cab5b0adce798e736a Apr 17 17:28:54.545094 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545074 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.545176 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545121 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.545214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545177 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.545463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545442 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:28:54.545463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545454 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:54.545531 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.545516 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:54.645941 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.645914 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29gcc\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.646089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.645953 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.646089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.645977 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.646089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.646000 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 17:28:54.646089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.646040 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration\") pod \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\" (UID: \"2494c2f9-377b-4e6f-97e1-a9e4159892ce\") " Apr 17 
17:28:54.646230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.646193 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-certificates\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.646230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.646210 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2494c2f9-377b-4e6f-97e1-a9e4159892ce-trusted-ca\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.646230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.646226 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2494c2f9-377b-4e6f-97e1-a9e4159892ce-ca-trust-extracted\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.648112 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.648084 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:54.648220 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.648117 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc" (OuterVolumeSpecName: "kube-api-access-29gcc") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "kube-api-access-29gcc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:54.648220 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.648134 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:54.648385 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.648365 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:54.648385 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.648366 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2494c2f9-377b-4e6f-97e1-a9e4159892ce" (UID: "2494c2f9-377b-4e6f-97e1-a9e4159892ce"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:54.747475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.747446 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29gcc\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-kube-api-access-29gcc\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.747475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.747471 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-bound-sa-token\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.747634 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.747481 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-installation-pull-secrets\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.747634 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.747492 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2494c2f9-377b-4e6f-97e1-a9e4159892ce-registry-tls\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:54.747634 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:54.747501 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2494c2f9-377b-4e6f-97e1-a9e4159892ce-image-registry-private-configuration\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:28:55.161791 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.161746 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" podUID="004f4c3e-b34a-4576-8348-b548fc8d5729" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 17 17:28:55.330690 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.330655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2kgc2" event={"ID":"707dd6a1-ae31-44f2-b222-f3bb7a67e699","Type":"ContainerStarted","Data":"d78c8b13f1aa41d67729408464d42a43635c8c5042429727cf686ec532ca93b1"} Apr 17 17:28:55.330813 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.330710 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2kgc2" event={"ID":"707dd6a1-ae31-44f2-b222-f3bb7a67e699","Type":"ContainerStarted","Data":"6aa23d9edfc68e777644fdf16d70c8e1027464e88826833c6a847fe88d9e5c92"} Apr 17 17:28:55.330813 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.330726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2kgc2" event={"ID":"707dd6a1-ae31-44f2-b222-f3bb7a67e699","Type":"ContainerStarted","Data":"437a3f543084a14be01d189e41f614690d8fc63b701975cab5b0adce798e736a"} Apr 17 17:28:55.331845 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.331819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" event={"ID":"fe1425c3-7bb7-4eea-8893-f2088028e46e","Type":"ContainerStarted","Data":"343302997a7a063c1043f7d842eb0c398195c97fdceb3cc5f32b2d6c96db3e68"} Apr 17 17:28:55.332005 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.331859 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-688d48f848-x5zg2" Apr 17 17:28:55.366487 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.366459 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-688d48f848-x5zg2"] Apr 17 17:28:55.374624 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.373242 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-688d48f848-x5zg2"] Apr 17 17:28:55.916223 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:55.916118 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2494c2f9-377b-4e6f-97e1-a9e4159892ce" path="/var/lib/kubelet/pods/2494c2f9-377b-4e6f-97e1-a9e4159892ce/volumes" Apr 17 17:28:56.335638 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:56.335606 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" event={"ID":"fe1425c3-7bb7-4eea-8893-f2088028e46e","Type":"ContainerStarted","Data":"d43e6555565459bb270d150e9e79098db9b50646a38d7a89289033ebf352cf3a"} Apr 17 17:28:56.352102 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:56.352033 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tp7vv" podStartSLOduration=1.21650255 podStartE2EDuration="2.352013742s" podCreationTimestamp="2026-04-17 17:28:54 +0000 UTC" firstStartedPulling="2026-04-17 17:28:54.487524299 +0000 UTC m=+151.161377613" lastFinishedPulling="2026-04-17 17:28:55.623035488 +0000 UTC m=+152.296888805" observedRunningTime="2026-04-17 17:28:56.351341567 +0000 UTC m=+153.025194904" watchObservedRunningTime="2026-04-17 17:28:56.352013742 +0000 UTC m=+153.025867082" Apr 17 17:28:57.339451 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:57.339416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2kgc2" 
event={"ID":"707dd6a1-ae31-44f2-b222-f3bb7a67e699","Type":"ContainerStarted","Data":"58ca5634055cb79bc21d80108330f8490267644ea6f0f99829fb1b98642e31d2"} Apr 17 17:28:57.358193 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:28:57.358147 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2kgc2" podStartSLOduration=1.154634036 podStartE2EDuration="3.358134224s" podCreationTimestamp="2026-04-17 17:28:54 +0000 UTC" firstStartedPulling="2026-04-17 17:28:54.575942929 +0000 UTC m=+151.249796243" lastFinishedPulling="2026-04-17 17:28:56.779443112 +0000 UTC m=+153.453296431" observedRunningTime="2026-04-17 17:28:57.357736687 +0000 UTC m=+154.031590023" watchObservedRunningTime="2026-04-17 17:28:57.358134224 +0000 UTC m=+154.031987561" Apr 17 17:28:59.692045 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:59.692002 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rr5tw" podUID="5f010296-632c-435e-b1db-62d8eeeae050" Apr 17 17:28:59.698111 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:28:59.698083 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zflhg" podUID="d55bc00d-c046-4764-9cfb-801efb7b23b8" Apr 17 17:29:00.346827 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:00.346748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:29:00.346827 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:00.346795 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rr5tw" Apr 17 17:29:00.922907 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:00.922861 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fczvt" podUID="8f82208d-ba82-434a-a209-d69847b4e54b" Apr 17 17:29:02.182397 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.182364 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j"] Apr 17 17:29:02.185460 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.185444 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:02.187882 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.187865 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 17:29:02.188155 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.188141 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cxb5k\"" Apr 17 17:29:02.191766 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.191667 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j"] Apr 17 17:29:02.195776 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.195752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-25d7j\" (UID: \"99b4f532-1014-4577-9b72-1fbd93101e6a\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:02.296660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.296622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-25d7j\" (UID: \"99b4f532-1014-4577-9b72-1fbd93101e6a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:02.296844 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:02.296776 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 17 17:29:02.296844 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:02.296836 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates podName:99b4f532-1014-4577-9b72-1fbd93101e6a nodeName:}" failed. No retries permitted until 2026-04-17 17:29:02.796821695 +0000 UTC m=+159.470675009 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-25d7j" (UID: "99b4f532-1014-4577-9b72-1fbd93101e6a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 17:29:02.799298 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:02.799258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-25d7j\" (UID: \"99b4f532-1014-4577-9b72-1fbd93101e6a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:02.799476 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:02.799393 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 17 17:29:02.799476 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:02.799457 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates podName:99b4f532-1014-4577-9b72-1fbd93101e6a nodeName:}" failed. No retries permitted until 2026-04-17 17:29:03.799441243 +0000 UTC m=+160.473294557 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-25d7j" (UID: "99b4f532-1014-4577-9b72-1fbd93101e6a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 17:29:03.805640 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:03.805593 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-25d7j\" (UID: \"99b4f532-1014-4577-9b72-1fbd93101e6a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:03.807894 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:03.807875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/99b4f532-1014-4577-9b72-1fbd93101e6a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-25d7j\" (UID: \"99b4f532-1014-4577-9b72-1fbd93101e6a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:03.994757 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:03.994720 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:04.105351 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.105312 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j"] Apr 17 17:29:04.108716 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:04.108674 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b4f532_1014_4577_9b72_1fbd93101e6a.slice/crio-4ff6b340427fcac0e774269af06413c137850149bbd6e69c34c8f6f80a8cac54 WatchSource:0}: Error finding container 4ff6b340427fcac0e774269af06413c137850149bbd6e69c34c8f6f80a8cac54: Status 404 returned error can't find the container with id 4ff6b340427fcac0e774269af06413c137850149bbd6e69c34c8f6f80a8cac54 Apr 17 17:29:04.357681 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.357597 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" event={"ID":"99b4f532-1014-4577-9b72-1fbd93101e6a","Type":"ContainerStarted","Data":"4ff6b340427fcac0e774269af06413c137850149bbd6e69c34c8f6f80a8cac54"} Apr 17 17:29:04.611469 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.611389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:29:04.611469 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.611429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " 
pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:29:04.613776 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.613750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f010296-632c-435e-b1db-62d8eeeae050-metrics-tls\") pod \"dns-default-rr5tw\" (UID: \"5f010296-632c-435e-b1db-62d8eeeae050\") " pod="openshift-dns/dns-default-rr5tw" Apr 17 17:29:04.613918 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.613899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d55bc00d-c046-4764-9cfb-801efb7b23b8-cert\") pod \"ingress-canary-zflhg\" (UID: \"d55bc00d-c046-4764-9cfb-801efb7b23b8\") " pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:29:04.850407 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.850374 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\"" Apr 17 17:29:04.851496 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.851476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\"" Apr 17 17:29:04.858612 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.858586 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zflhg" Apr 17 17:29:04.858612 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.858602 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rr5tw" Apr 17 17:29:04.986073 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:04.986047 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zflhg"] Apr 17 17:29:05.008491 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.008462 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rr5tw"] Apr 17 17:29:05.161417 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.161302 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" podUID="004f4c3e-b34a-4576-8348-b548fc8d5729" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:29:05.161417 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.161386 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" Apr 17 17:29:05.162069 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.162033 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c67ef2ee1b69286fcd3690cadb6c25d9b433f572e7d047f00788d1968c4899fc"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 17:29:05.162172 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.162104 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" podUID="004f4c3e-b34a-4576-8348-b548fc8d5729" containerName="service-proxy" containerID="cri-o://c67ef2ee1b69286fcd3690cadb6c25d9b433f572e7d047f00788d1968c4899fc" gracePeriod=30 Apr 17 17:29:05.193269 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:05.193236 2571 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55bc00d_c046_4764_9cfb_801efb7b23b8.slice/crio-496e366cd8ead1f161c516bd603aa41d43ce0065398a773aa3b0970cfb01194c WatchSource:0}: Error finding container 496e366cd8ead1f161c516bd603aa41d43ce0065398a773aa3b0970cfb01194c: Status 404 returned error can't find the container with id 496e366cd8ead1f161c516bd603aa41d43ce0065398a773aa3b0970cfb01194c Apr 17 17:29:05.193914 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:05.193893 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f010296_632c_435e_b1db_62d8eeeae050.slice/crio-03351bc09c8b6f9988d44da12305d443cade1b3f2596b90ebae89f6b99eb7818 WatchSource:0}: Error finding container 03351bc09c8b6f9988d44da12305d443cade1b3f2596b90ebae89f6b99eb7818: Status 404 returned error can't find the container with id 03351bc09c8b6f9988d44da12305d443cade1b3f2596b90ebae89f6b99eb7818 Apr 17 17:29:05.361545 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.361509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rr5tw" event={"ID":"5f010296-632c-435e-b1db-62d8eeeae050","Type":"ContainerStarted","Data":"03351bc09c8b6f9988d44da12305d443cade1b3f2596b90ebae89f6b99eb7818"} Apr 17 17:29:05.363378 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.363355 2571 generic.go:358] "Generic (PLEG): container finished" podID="004f4c3e-b34a-4576-8348-b548fc8d5729" containerID="c67ef2ee1b69286fcd3690cadb6c25d9b433f572e7d047f00788d1968c4899fc" exitCode=2 Apr 17 17:29:05.363499 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.363423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerDied","Data":"c67ef2ee1b69286fcd3690cadb6c25d9b433f572e7d047f00788d1968c4899fc"} 
Apr 17 17:29:05.363499 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.363456 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d859f5596-hkptg" event={"ID":"004f4c3e-b34a-4576-8348-b548fc8d5729","Type":"ContainerStarted","Data":"30ec1dace235d7e724151bee2b815cff46980344dd2b2aea0774876d213beab4"} Apr 17 17:29:05.364635 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.364611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" event={"ID":"99b4f532-1014-4577-9b72-1fbd93101e6a","Type":"ContainerStarted","Data":"89c79eb53c2690e95e4ad91fd21db658060b1442d34f781d0b1cdcfae485f0d7"} Apr 17 17:29:05.364849 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.364821 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:05.368602 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.368574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zflhg" event={"ID":"d55bc00d-c046-4764-9cfb-801efb7b23b8","Type":"ContainerStarted","Data":"496e366cd8ead1f161c516bd603aa41d43ce0065398a773aa3b0970cfb01194c"} Apr 17 17:29:05.371501 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.371484 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" Apr 17 17:29:05.398262 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:05.398217 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-25d7j" podStartSLOduration=2.269850117 podStartE2EDuration="3.398205155s" podCreationTimestamp="2026-04-17 17:29:02 +0000 UTC" firstStartedPulling="2026-04-17 17:29:04.110291841 +0000 UTC m=+160.784145154" 
lastFinishedPulling="2026-04-17 17:29:05.238646877 +0000 UTC m=+161.912500192" observedRunningTime="2026-04-17 17:29:05.397329 +0000 UTC m=+162.071182349" watchObservedRunningTime="2026-04-17 17:29:05.398205155 +0000 UTC m=+162.072058481" Apr 17 17:29:06.259329 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.259294 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"] Apr 17 17:29:06.262850 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.262821 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.265951 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.265926 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 17:29:06.266584 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.266562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-w2mh2\"" Apr 17 17:29:06.267575 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.267553 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:29:06.268025 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.268003 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 17:29:06.268025 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.268017 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:29:06.268186 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.268062 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:29:06.274017 
ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.273984 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"] Apr 17 17:29:06.322740 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.322693 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hz9m\" (UniqueName: \"kubernetes.io/projected/b7058f2c-5b01-4ef0-a631-06472e24ae23-kube-api-access-6hz9m\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.322898 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.322771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.322898 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.322791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7058f2c-5b01-4ef0-a631-06472e24ae23-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.322898 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.322818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: 
\"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.423716 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.423671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hz9m\" (UniqueName: \"kubernetes.io/projected/b7058f2c-5b01-4ef0-a631-06472e24ae23-kube-api-access-6hz9m\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.423892 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.423749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.423892 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.423775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7058f2c-5b01-4ef0-a631-06472e24ae23-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.423892 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.423809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" Apr 17 17:29:06.424067 ip-10-0-138-42 kubenswrapper[2571]: E0417 
17:29:06.424026 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 17:29:06.424119 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:06.424102 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls podName:b7058f2c-5b01-4ef0-a631-06472e24ae23 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:06.92408045 +0000 UTC m=+163.597933766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-vlqmq" (UID: "b7058f2c-5b01-4ef0-a631-06472e24ae23") : secret "prometheus-operator-tls" not found
Apr 17 17:29:06.424538 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.424516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7058f2c-5b01-4ef0-a631-06472e24ae23-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:06.426477 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.426458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:06.433124 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.433095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hz9m\" (UniqueName: \"kubernetes.io/projected/b7058f2c-5b01-4ef0-a631-06472e24ae23-kube-api-access-6hz9m\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:06.928032 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.927995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:06.930573 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:06.930549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7058f2c-5b01-4ef0-a631-06472e24ae23-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vlqmq\" (UID: \"b7058f2c-5b01-4ef0-a631-06472e24ae23\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:07.174854 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.174818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"
Apr 17 17:29:07.319237 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.319202 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vlqmq"]
Apr 17 17:29:07.322068 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:07.322032 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7058f2c_5b01_4ef0_a631_06472e24ae23.slice/crio-fa105482b65503f558aa9df2523e2bf123ece2f8af6d84e1559f0faac5738a79 WatchSource:0}: Error finding container fa105482b65503f558aa9df2523e2bf123ece2f8af6d84e1559f0faac5738a79: Status 404 returned error can't find the container with id fa105482b65503f558aa9df2523e2bf123ece2f8af6d84e1559f0faac5738a79
Apr 17 17:29:07.376720 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.376664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zflhg" event={"ID":"d55bc00d-c046-4764-9cfb-801efb7b23b8","Type":"ContainerStarted","Data":"c44dffc7d268b3ba64220808dca01fc50b96678404b2f30e0160b53456d5fec4"}
Apr 17 17:29:07.378297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.378270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rr5tw" event={"ID":"5f010296-632c-435e-b1db-62d8eeeae050","Type":"ContainerStarted","Data":"aa72eea18b83e56e8374aaf0e876b64b7838cfdb38deca2c12a8d31213dea942"}
Apr 17 17:29:07.378297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.378295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rr5tw" event={"ID":"5f010296-632c-435e-b1db-62d8eeeae050","Type":"ContainerStarted","Data":"ef98758c583bdc4f618979965f6b94a0f580fe8a118772a96ac70ec3a6b755aa"}
Apr 17 17:29:07.378474 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.378398 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rr5tw"
Apr 17 17:29:07.379360 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.379341 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" event={"ID":"b7058f2c-5b01-4ef0-a631-06472e24ae23","Type":"ContainerStarted","Data":"fa105482b65503f558aa9df2523e2bf123ece2f8af6d84e1559f0faac5738a79"}
Apr 17 17:29:07.394350 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:07.394314 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zflhg" podStartSLOduration=129.676828988 podStartE2EDuration="2m11.394301467s" podCreationTimestamp="2026-04-17 17:26:56 +0000 UTC" firstStartedPulling="2026-04-17 17:29:05.271764755 +0000 UTC m=+161.945618075" lastFinishedPulling="2026-04-17 17:29:06.989237237 +0000 UTC m=+163.663090554" observedRunningTime="2026-04-17 17:29:07.393200024 +0000 UTC m=+164.067053360" watchObservedRunningTime="2026-04-17 17:29:07.394301467 +0000 UTC m=+164.068154804"
Apr 17 17:29:09.386358 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:09.386326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" event={"ID":"b7058f2c-5b01-4ef0-a631-06472e24ae23","Type":"ContainerStarted","Data":"19db7330b17a8e4cbaa993bddd72396aa78755e7a9de64ece01281b754018bab"}
Apr 17 17:29:09.386358 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:09.386360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" event={"ID":"b7058f2c-5b01-4ef0-a631-06472e24ae23","Type":"ContainerStarted","Data":"a675e72ca32278384969a45974153e744e1e57697b39567185911260e80a09d8"}
Apr 17 17:29:09.404374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:09.404335 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vlqmq" podStartSLOduration=2.2800862 podStartE2EDuration="3.404324327s" podCreationTimestamp="2026-04-17 17:29:06 +0000 UTC" firstStartedPulling="2026-04-17 17:29:07.323804115 +0000 UTC m=+163.997657429" lastFinishedPulling="2026-04-17 17:29:08.448042242 +0000 UTC m=+165.121895556" observedRunningTime="2026-04-17 17:29:09.402790907 +0000 UTC m=+166.076644244" watchObservedRunningTime="2026-04-17 17:29:09.404324327 +0000 UTC m=+166.078177663"
Apr 17 17:29:09.404829 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:09.404804 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rr5tw" podStartSLOduration=131.690448868 podStartE2EDuration="2m13.404798326s" podCreationTimestamp="2026-04-17 17:26:56 +0000 UTC" firstStartedPulling="2026-04-17 17:29:05.271531798 +0000 UTC m=+161.945385119" lastFinishedPulling="2026-04-17 17:29:06.985881262 +0000 UTC m=+163.659734577" observedRunningTime="2026-04-17 17:29:07.409195937 +0000 UTC m=+164.083049274" watchObservedRunningTime="2026-04-17 17:29:09.404798326 +0000 UTC m=+166.078651661"
Apr 17 17:29:11.671758 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.671726 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5qcf7"]
Apr 17 17:29:11.674839 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.674824 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.677348 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.677326 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:29:11.677475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.677358 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:29:11.677475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.677415 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:29:11.677475 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.677455 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2tjqb\""
Apr 17 17:29:11.760880 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.760856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.760993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.760889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-metrics-client-ca\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.760993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.760973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-wtmp\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-root\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-accelerators-collector-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-sys\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6w2\" (UniqueName: \"kubernetes.io/projected/dd4675a1-0ff5-4c27-a0da-329554311931-kube-api-access-8c6w2\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.761182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.761140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-textfile\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862333 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862445 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6w2\" (UniqueName: \"kubernetes.io/projected/dd4675a1-0ff5-4c27-a0da-329554311931-kube-api-access-8c6w2\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862445 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-textfile\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862445 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862445 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-metrics-client-ca\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862595 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-wtmp\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862595 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-root\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862595 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-accelerators-collector-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862595 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:11.862551 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 17:29:11.862803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-sys\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862803 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:11.862615 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls podName:dd4675a1-0ff5-4c27-a0da-329554311931 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:12.362594885 +0000 UTC m=+169.036448210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls") pod "node-exporter-5qcf7" (UID: "dd4675a1-0ff5-4c27-a0da-329554311931") : secret "node-exporter-tls" not found
Apr 17 17:29:11.862803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-sys\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-root\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.862803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-wtmp\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.863049 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.862823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-textfile\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.863103 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.863087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-accelerators-collector-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.863136 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.863093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4675a1-0ff5-4c27-a0da-329554311931-metrics-client-ca\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.864876 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.864858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:11.870913 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:11.870893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6w2\" (UniqueName: \"kubernetes.io/projected/dd4675a1-0ff5-4c27-a0da-329554311931-kube-api-access-8c6w2\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:12.367023 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.366994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:12.369128 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.369104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4675a1-0ff5-4c27-a0da-329554311931-node-exporter-tls\") pod \"node-exporter-5qcf7\" (UID: \"dd4675a1-0ff5-4c27-a0da-329554311931\") " pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:12.583838 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.583807 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5qcf7"
Apr 17 17:29:12.591319 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:12.591292 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd4675a1_0ff5_4c27_a0da_329554311931.slice/crio-b326c45b068340ba1187f5eb1338c48e655343f666f300752d95d6711910f2c7 WatchSource:0}: Error finding container b326c45b068340ba1187f5eb1338c48e655343f666f300752d95d6711910f2c7: Status 404 returned error can't find the container with id b326c45b068340ba1187f5eb1338c48e655343f666f300752d95d6711910f2c7
Apr 17 17:29:12.704480 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.704402 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:12.710669 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.710646 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.713378 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713356 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fhgjs\""
Apr 17 17:29:12.713378 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713364 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 17:29:12.713555 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 17:29:12.713555 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713443 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 17:29:12.713660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713622 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 17:29:12.713660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713638 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 17:29:12.713911 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713893 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 17:29:12.713988 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713912 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 17:29:12.713988 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.713901 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 17:29:12.714187 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.714172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 17:29:12.722180 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.722160 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:12.769560 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769679 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769679 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769679 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769622 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769679 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769679 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnzmr\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.769863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.769796 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.870933 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.870904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.870933 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.870944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.870967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.870985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnzmr\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871153 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:29:12.871152 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle podName:e82510c1-9a31-40fe-8c55-1b590f89ea23 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:13.37112477 +0000 UTC m=+170.044978103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23") : configmap references non-existent config key: ca-bundle.crt
Apr 17 17:29:12.871592 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871592 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871592 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.871827 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.871804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.872126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.872105 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.874158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874309 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.874294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.874925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.874986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.875061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.875106 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.875130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") "
pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:12.874965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.875223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:12.875593 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.875461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:12.886711 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:12.886677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnzmr\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:13.374671 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.374646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:13.375276 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.375251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:13.397026 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.397000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5qcf7" event={"ID":"dd4675a1-0ff5-4c27-a0da-329554311931","Type":"ContainerStarted","Data":"d209b5ae886ffd9698d7f3fbc8fddb39dc538b20e598ce8e12660662269d981d"} Apr 17 17:29:13.397128 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.397034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5qcf7" event={"ID":"dd4675a1-0ff5-4c27-a0da-329554311931","Type":"ContainerStarted","Data":"b326c45b068340ba1187f5eb1338c48e655343f666f300752d95d6711910f2c7"} Apr 17 17:29:13.602444 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.602375 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-75d47484b8-fqdcg"] Apr 17 17:29:13.605755 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.605739 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.608431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608404 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 17:29:13.608431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608426 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2cok7jr2fu66e\"" Apr 17 17:29:13.608622 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608433 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 17:29:13.608956 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608929 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-b277z\"" Apr 17 17:29:13.609077 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608972 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 17:29:13.609077 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.608985 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 17:29:13.609213 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.609176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 17:29:13.614972 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.614952 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-75d47484b8-fqdcg"] Apr 17 17:29:13.619837 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.619814 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:13.678792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678546 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.678792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.678792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/924523ba-7166-413d-b895-964f97a258a0-metrics-client-ca\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.678792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" 
Apr 17 17:29:13.678792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678767 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmdl\" (UniqueName: \"kubernetes.io/projected/924523ba-7166-413d-b895-964f97a258a0-kube-api-access-dwmdl\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.679188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-grpc-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.679188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.679188 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.678921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.742250 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.742227 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:29:13.744416 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:13.744393 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode82510c1_9a31_40fe_8c55_1b590f89ea23.slice/crio-1a5308994dc68072cfbb108a25dd07f93e0d019b46ea82b11b2e51cc17c3af18 WatchSource:0}: Error finding container 1a5308994dc68072cfbb108a25dd07f93e0d019b46ea82b11b2e51cc17c3af18: Status 404 returned error can't find the container with id 1a5308994dc68072cfbb108a25dd07f93e0d019b46ea82b11b2e51cc17c3af18 Apr 17 17:29:13.780219 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780307 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmdl\" (UniqueName: \"kubernetes.io/projected/924523ba-7166-413d-b895-964f97a258a0-kube-api-access-dwmdl\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780307 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-grpc-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780307 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:29:13.780290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.780418 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.780384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/924523ba-7166-413d-b895-964f97a258a0-metrics-client-ca\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.781211 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.781178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/924523ba-7166-413d-b895-964f97a258a0-metrics-client-ca\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.782827 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.782801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.782917 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.782886 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.783059 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.783044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: 
\"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.783149 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.783135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.783311 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.783290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.783395 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.783382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/924523ba-7166-413d-b895-964f97a258a0-secret-grpc-tls\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.787916 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.787901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmdl\" (UniqueName: \"kubernetes.io/projected/924523ba-7166-413d-b895-964f97a258a0-kube-api-access-dwmdl\") pod \"thanos-querier-75d47484b8-fqdcg\" (UID: \"924523ba-7166-413d-b895-964f97a258a0\") " pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:13.915521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:13.915461 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:14.033444 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.033415 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-75d47484b8-fqdcg"] Apr 17 17:29:14.036473 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:29:14.036444 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924523ba_7166_413d_b895_964f97a258a0.slice/crio-073d472475394fb0bb913579d33c2a87307b50cbfe9f5c35cf83205c8206395b WatchSource:0}: Error finding container 073d472475394fb0bb913579d33c2a87307b50cbfe9f5c35cf83205c8206395b: Status 404 returned error can't find the container with id 073d472475394fb0bb913579d33c2a87307b50cbfe9f5c35cf83205c8206395b Apr 17 17:29:14.401239 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.401197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"073d472475394fb0bb913579d33c2a87307b50cbfe9f5c35cf83205c8206395b"} Apr 17 17:29:14.402575 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.402535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"1a5308994dc68072cfbb108a25dd07f93e0d019b46ea82b11b2e51cc17c3af18"} Apr 17 17:29:14.404095 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.404066 2571 generic.go:358] "Generic (PLEG): container finished" podID="dd4675a1-0ff5-4c27-a0da-329554311931" containerID="d209b5ae886ffd9698d7f3fbc8fddb39dc538b20e598ce8e12660662269d981d" exitCode=0 Apr 17 17:29:14.404204 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.404111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5qcf7" 
event={"ID":"dd4675a1-0ff5-4c27-a0da-329554311931","Type":"ContainerDied","Data":"d209b5ae886ffd9698d7f3fbc8fddb39dc538b20e598ce8e12660662269d981d"} Apr 17 17:29:14.911888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:14.911867 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt" Apr 17 17:29:15.409455 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:15.409404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5qcf7" event={"ID":"dd4675a1-0ff5-4c27-a0da-329554311931","Type":"ContainerStarted","Data":"5b5a9cb8e800cb1a1c235fcebec101fb7a5f02dfe148124449eb717c21f2aa36"} Apr 17 17:29:15.409455 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:15.409452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5qcf7" event={"ID":"dd4675a1-0ff5-4c27-a0da-329554311931","Type":"ContainerStarted","Data":"630c88271f2e519f77805f5483f70d94e385891f187e6c65a210d514eef6531b"} Apr 17 17:29:15.410858 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:15.410830 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22" exitCode=0 Apr 17 17:29:15.410963 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:15.410890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22"} Apr 17 17:29:15.430526 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:15.430483 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5qcf7" podStartSLOduration=3.77876908 podStartE2EDuration="4.430468419s" podCreationTimestamp="2026-04-17 17:29:11 +0000 UTC" firstStartedPulling="2026-04-17 
17:29:12.593498911 +0000 UTC m=+169.267352226" lastFinishedPulling="2026-04-17 17:29:13.245198251 +0000 UTC m=+169.919051565" observedRunningTime="2026-04-17 17:29:15.429674771 +0000 UTC m=+172.103528136" watchObservedRunningTime="2026-04-17 17:29:15.430468419 +0000 UTC m=+172.104321756" Apr 17 17:29:16.416537 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:16.416490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"f42a7f19fa98c846fca24fb2197e0ad248c7cb41e012464c2e2b9b20af9d65ea"} Apr 17 17:29:16.416537 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:16.416539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"de6ab163951599cae127779848b951b34addc245da04305e9d2f509a96918d91"} Apr 17 17:29:16.417004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:16.416557 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"83df3e4cdf450db3789cfb811caab1786076cd360b037839b85986259cd4d9b2"} Apr 17 17:29:17.384241 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.384212 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rr5tw" Apr 17 17:29:17.421623 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.421589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"d188c3838b4e79e53ee8b33354c1b7c5a6e15c01f90ff4b697fd9d706ab588e5"} Apr 17 17:29:17.421623 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.421623 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"228f85e2b7dd8395bd68fdef7f00248a1420498ea827952d02e01804f9fa2d90"} Apr 17 17:29:17.422048 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.421633 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" event={"ID":"924523ba-7166-413d-b895-964f97a258a0","Type":"ContainerStarted","Data":"8c4b1b2c98ff69c68fe5859c711be2a29d6ddf6f830157a10e624f4ffcb493d9"} Apr 17 17:29:17.422048 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.421793 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:17.424180 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0"} Apr 17 17:29:17.424288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca"} Apr 17 17:29:17.424288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42"} Apr 17 17:29:17.424288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424212 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb"} Apr 17 17:29:17.424288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81"} Apr 17 17:29:17.424288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.424235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerStarted","Data":"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9"} Apr 17 17:29:17.510581 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.510535 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.313205408 podStartE2EDuration="5.510518344s" podCreationTimestamp="2026-04-17 17:29:12 +0000 UTC" firstStartedPulling="2026-04-17 17:29:13.746180325 +0000 UTC m=+170.420033642" lastFinishedPulling="2026-04-17 17:29:16.943493263 +0000 UTC m=+173.617346578" observedRunningTime="2026-04-17 17:29:17.508299213 +0000 UTC m=+174.182152550" watchObservedRunningTime="2026-04-17 17:29:17.510518344 +0000 UTC m=+174.184371689" Apr 17 17:29:17.510802 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:17.510748 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" podStartSLOduration=1.6049490180000001 podStartE2EDuration="4.510739576s" podCreationTimestamp="2026-04-17 17:29:13 +0000 UTC" firstStartedPulling="2026-04-17 17:29:14.038267307 +0000 UTC m=+170.712120621" lastFinishedPulling="2026-04-17 17:29:16.944057864 +0000 UTC m=+173.617911179" observedRunningTime="2026-04-17 
17:29:17.463413931 +0000 UTC m=+174.137267268" watchObservedRunningTime="2026-04-17 17:29:17.510739576 +0000 UTC m=+174.184592913" Apr 17 17:29:23.434803 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:23.434776 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-75d47484b8-fqdcg" Apr 17 17:29:36.482203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:36.482169 2571 generic.go:358] "Generic (PLEG): container finished" podID="0ba9623b-16db-45c3-b260-9df660f50aa1" containerID="e4e040159cb7d4f2e91f3ec671ed5300ec98fb73c60c39a6e726e40789994f7b" exitCode=0 Apr 17 17:29:36.482683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:36.482228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" event={"ID":"0ba9623b-16db-45c3-b260-9df660f50aa1","Type":"ContainerDied","Data":"e4e040159cb7d4f2e91f3ec671ed5300ec98fb73c60c39a6e726e40789994f7b"} Apr 17 17:29:36.482683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:36.482520 2571 scope.go:117] "RemoveContainer" containerID="e4e040159cb7d4f2e91f3ec671ed5300ec98fb73c60c39a6e726e40789994f7b" Apr 17 17:29:37.486690 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:29:37.486662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sw58j" event={"ID":"0ba9623b-16db-45c3-b260-9df660f50aa1","Type":"ContainerStarted","Data":"bc1cd9653cb9fcdca12b695a38a56f1fc16983b81e1f3cc61bf89af451f9d9f5"} Apr 17 17:30:31.961355 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961321 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:30:31.961864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961749 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="alertmanager" 
containerID="cri-o://714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" gracePeriod=120 Apr 17 17:30:31.961864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961784 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-metric" containerID="cri-o://dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" gracePeriod=120 Apr 17 17:30:31.961864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961792 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-web" containerID="cri-o://9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" gracePeriod=120 Apr 17 17:30:31.961864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961822 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy" containerID="cri-o://6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" gracePeriod=120 Apr 17 17:30:31.962080 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961856 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="prom-label-proxy" containerID="cri-o://36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" gracePeriod=120 Apr 17 17:30:31.962080 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:31.961871 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="config-reloader" 
containerID="cri-o://e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" gracePeriod=120 Apr 17 17:30:32.644140 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644108 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" exitCode=0 Apr 17 17:30:32.644140 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644134 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" exitCode=0 Apr 17 17:30:32.644140 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644145 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" exitCode=0 Apr 17 17:30:32.644140 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644151 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" exitCode=0 Apr 17 17:30:32.644426 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644185 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0"} Apr 17 17:30:32.644426 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42"} Apr 17 17:30:32.644426 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81"} Apr 17 17:30:32.644426 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:32.644250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9"} Apr 17 17:30:33.196019 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.195998 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.300519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300473 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300495 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300519 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300550 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300595 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300628 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300660 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300729 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.300804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300784 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.301203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300823 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.301203 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.300852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnzmr\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr\") pod \"e82510c1-9a31-40fe-8c55-1b590f89ea23\" (UID: \"e82510c1-9a31-40fe-8c55-1b590f89ea23\") " Apr 17 17:30:33.301313 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.301222 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:33.301766 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.301692 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:33.301982 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.301954 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:33.303876 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.303836 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.304114 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.304089 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr" (OuterVolumeSpecName: "kube-api-access-gnzmr") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "kube-api-access-gnzmr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:33.304114 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.304103 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume" (OuterVolumeSpecName: "config-volume") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.304251 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.304088 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.304251 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.304125 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.304490 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.304472 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out" (OuterVolumeSpecName: "config-out") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:33.305105 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.305082 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:33.305321 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.305301 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.307596 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.307577 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.313318 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.313294 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config" (OuterVolumeSpecName: "web-config") pod "e82510c1-9a31-40fe-8c55-1b590f89ea23" (UID: "e82510c1-9a31-40fe-8c55-1b590f89ea23"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:33.402156 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402111 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-metrics-client-ca\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402156 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402152 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402156 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402163 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnzmr\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-kube-api-access-gnzmr\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402156 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402173 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402183 2571 reconciler_common.go:299] "Volume detached for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-cluster-tls-config\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402192 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-web-config\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402200 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-main-tls\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402209 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-volume\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402218 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e82510c1-9a31-40fe-8c55-1b590f89ea23-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402226 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-main-db\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402236 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e82510c1-9a31-40fe-8c55-1b590f89ea23-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402246 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e82510c1-9a31-40fe-8c55-1b590f89ea23-tls-assets\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.402404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.402253 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e82510c1-9a31-40fe-8c55-1b590f89ea23-config-out\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:30:33.649561 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649480 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" exitCode=0 Apr 17 17:30:33.649561 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649505 2571 generic.go:358] "Generic (PLEG): container finished" podID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerID="9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" exitCode=0 Apr 17 17:30:33.649808 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca"} Apr 17 17:30:33.649808 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649618 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb"} Apr 17 17:30:33.649808 
ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649633 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e82510c1-9a31-40fe-8c55-1b590f89ea23","Type":"ContainerDied","Data":"1a5308994dc68072cfbb108a25dd07f93e0d019b46ea82b11b2e51cc17c3af18"} Apr 17 17:30:33.649808 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649651 2571 scope.go:117] "RemoveContainer" containerID="36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" Apr 17 17:30:33.649808 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.649580 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.662682 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.662664 2571 scope.go:117] "RemoveContainer" containerID="dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" Apr 17 17:30:33.669582 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.669566 2571 scope.go:117] "RemoveContainer" containerID="6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" Apr 17 17:30:33.675890 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.675870 2571 scope.go:117] "RemoveContainer" containerID="9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" Apr 17 17:30:33.676831 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.676812 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:30:33.680345 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.680327 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:30:33.682358 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.682343 2571 scope.go:117] "RemoveContainer" containerID="e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" Apr 17 17:30:33.688372 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.688357 2571 scope.go:117] "RemoveContainer" 
containerID="714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" Apr 17 17:30:33.694344 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.694320 2571 scope.go:117] "RemoveContainer" containerID="0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22" Apr 17 17:30:33.700426 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.700408 2571 scope.go:117] "RemoveContainer" containerID="36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" Apr 17 17:30:33.700681 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.700663 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0\": container with ID starting with 36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0 not found: ID does not exist" containerID="36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" Apr 17 17:30:33.700753 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.700690 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0"} err="failed to get container status \"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0\": rpc error: code = NotFound desc = could not find container \"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0\": container with ID starting with 36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0 not found: ID does not exist" Apr 17 17:30:33.700753 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.700730 2571 scope.go:117] "RemoveContainer" containerID="dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" Apr 17 17:30:33.700951 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.700934 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca\": container with ID starting with dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca not found: ID does not exist" containerID="dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" Apr 17 17:30:33.700993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.700957 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca"} err="failed to get container status \"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca\": rpc error: code = NotFound desc = could not find container \"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca\": container with ID starting with dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca not found: ID does not exist" Apr 17 17:30:33.700993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.700971 2571 scope.go:117] "RemoveContainer" containerID="6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" Apr 17 17:30:33.701199 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.701182 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42\": container with ID starting with 6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42 not found: ID does not exist" containerID="6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" Apr 17 17:30:33.701250 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701203 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42"} err="failed to get container status \"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42\": rpc error: code = NotFound desc = could not find container 
\"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42\": container with ID starting with 6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42 not found: ID does not exist" Apr 17 17:30:33.701250 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701218 2571 scope.go:117] "RemoveContainer" containerID="9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" Apr 17 17:30:33.701427 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.701412 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb\": container with ID starting with 9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb not found: ID does not exist" containerID="9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" Apr 17 17:30:33.701466 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701431 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb"} err="failed to get container status \"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb\": rpc error: code = NotFound desc = could not find container \"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb\": container with ID starting with 9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb not found: ID does not exist" Apr 17 17:30:33.701466 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701445 2571 scope.go:117] "RemoveContainer" containerID="e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" Apr 17 17:30:33.701643 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.701628 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81\": container with ID starting with 
e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81 not found: ID does not exist" containerID="e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" Apr 17 17:30:33.701681 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701648 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81"} err="failed to get container status \"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81\": rpc error: code = NotFound desc = could not find container \"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81\": container with ID starting with e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81 not found: ID does not exist" Apr 17 17:30:33.701681 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701660 2571 scope.go:117] "RemoveContainer" containerID="714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" Apr 17 17:30:33.701947 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.701932 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9\": container with ID starting with 714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9 not found: ID does not exist" containerID="714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" Apr 17 17:30:33.701988 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701950 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9"} err="failed to get container status \"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9\": rpc error: code = NotFound desc = could not find container \"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9\": container with ID starting with 
714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9 not found: ID does not exist" Apr 17 17:30:33.701988 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.701962 2571 scope.go:117] "RemoveContainer" containerID="0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22" Apr 17 17:30:33.702141 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:30:33.702125 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22\": container with ID starting with 0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22 not found: ID does not exist" containerID="0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22" Apr 17 17:30:33.702202 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702150 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22"} err="failed to get container status \"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22\": rpc error: code = NotFound desc = could not find container \"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22\": container with ID starting with 0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22 not found: ID does not exist" Apr 17 17:30:33.702202 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702169 2571 scope.go:117] "RemoveContainer" containerID="36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0" Apr 17 17:30:33.702417 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702399 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0"} err="failed to get container status \"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0\": rpc error: code = NotFound desc = could not find container 
\"36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0\": container with ID starting with 36a219ec7357102425ab2c0f063ce5facc0335c736cb0726e2b889104c373db0 not found: ID does not exist" Apr 17 17:30:33.702469 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702418 2571 scope.go:117] "RemoveContainer" containerID="dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca" Apr 17 17:30:33.702614 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702598 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca"} err="failed to get container status \"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca\": rpc error: code = NotFound desc = could not find container \"dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca\": container with ID starting with dba3dc900634f34fd888bb356c361b077b8e32f0bb681064934033e7042a0bca not found: ID does not exist" Apr 17 17:30:33.702660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702615 2571 scope.go:117] "RemoveContainer" containerID="6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42" Apr 17 17:30:33.702851 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702833 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42"} err="failed to get container status \"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42\": rpc error: code = NotFound desc = could not find container \"6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42\": container with ID starting with 6fd1c1d6f43e43c983d74ed74549f7e3cdcb1970d485d5986176c8018c090d42 not found: ID does not exist" Apr 17 17:30:33.702908 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.702852 2571 scope.go:117] "RemoveContainer" 
containerID="9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb" Apr 17 17:30:33.703039 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703023 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb"} err="failed to get container status \"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb\": rpc error: code = NotFound desc = could not find container \"9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb\": container with ID starting with 9f5f3d4cecd61bca7cd7193349103485db2cfce06614724828bc076f18f570cb not found: ID does not exist" Apr 17 17:30:33.703039 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703038 2571 scope.go:117] "RemoveContainer" containerID="e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81" Apr 17 17:30:33.703221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703206 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81"} err="failed to get container status \"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81\": rpc error: code = NotFound desc = could not find container \"e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81\": container with ID starting with e558a9acaa8d53a547b85de36a3b76c62c57ae7982cff1eaf5758f0bd7f50c81 not found: ID does not exist" Apr 17 17:30:33.703262 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703221 2571 scope.go:117] "RemoveContainer" containerID="714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9" Apr 17 17:30:33.703395 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703380 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9"} err="failed to get container status 
\"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9\": rpc error: code = NotFound desc = could not find container \"714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9\": container with ID starting with 714240e65e2dd8a470a9aa8079e263c31b5adabc61640079780ac4bbdaa279c9 not found: ID does not exist" Apr 17 17:30:33.703438 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703396 2571 scope.go:117] "RemoveContainer" containerID="0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22" Apr 17 17:30:33.703606 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.703588 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22"} err="failed to get container status \"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22\": rpc error: code = NotFound desc = could not find container \"0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22\": container with ID starting with 0ffc86e48c6e45646199355ff56489de5f7a4dbfb4ecc216b0db37754d9a0f22 not found: ID does not exist" Apr 17 17:30:33.706190 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706173 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:30:33.706443 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706432 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="init-config-reloader" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706445 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="init-config-reloader" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706459 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" 
containerName="prom-label-proxy" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706464 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="prom-label-proxy" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706471 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-web" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706477 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-web" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706483 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy" Apr 17 17:30:33.706485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706487 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706497 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="alertmanager" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706504 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="alertmanager" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706516 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="config-reloader" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706522 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="config-reloader" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706527 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-metric" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706532 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-metric" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706572 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="config-reloader" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706581 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706586 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="prom-label-proxy" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706593 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="alertmanager" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706599 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-web" Apr 17 17:30:33.706667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.706605 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" containerName="kube-rbac-proxy-metric" Apr 17 17:30:33.711364 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.711348 2571 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.714746 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.714675 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:30:33.714975 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.714949 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:30:33.714975 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.714960 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fhgjs\"" Apr 17 17:30:33.715108 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.714962 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:30:33.715108 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.714963 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:30:33.715214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.715172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:30:33.715214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.715190 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:30:33.715302 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.715210 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:30:33.715347 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.715296 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:30:33.719406 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.719368 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:30:33.722897 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.722879 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:30:33.805884 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.805852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.805884 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.805888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-web-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806053 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.805913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49h8q\" (UniqueName: \"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-kube-api-access-49h8q\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806053 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.805934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806053 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806053 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-out\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806086 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.806221 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.806197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.906981 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.906895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.906981 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.906938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-web-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.906998 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49h8q\" (UniqueName: \"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-kube-api-access-49h8q\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-out\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907184 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
17:30:33.907514 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907514 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907514 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.907514 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.907320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.908063 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.908037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.908300 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.908279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.908449 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.908423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.909869 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.909845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-out\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910241 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910294 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910354 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910659 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-web-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910694 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.910921 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.910907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.911755 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.911742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.914848 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.914829 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49h8q\" (UniqueName: \"kubernetes.io/projected/d6ed31df-b9fd-4360-9928-dee0dfd5ecfd-kube-api-access-49h8q\") pod \"alertmanager-main-0\" (UID: \"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:30:33.915290 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:33.915268 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82510c1-9a31-40fe-8c55-1b590f89ea23" path="/var/lib/kubelet/pods/e82510c1-9a31-40fe-8c55-1b590f89ea23/volumes" Apr 17 17:30:34.022057 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:34.022028 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:30:34.144386 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:34.144357 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:30:34.147358 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:30:34.147329 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ed31df_b9fd_4360_9928_dee0dfd5ecfd.slice/crio-aef49a56026fb435feef53c23ec3acb878ba8d598a68aa7a6b77f91040b949d3 WatchSource:0}: Error finding container aef49a56026fb435feef53c23ec3acb878ba8d598a68aa7a6b77f91040b949d3: Status 404 returned error can't find the container with id aef49a56026fb435feef53c23ec3acb878ba8d598a68aa7a6b77f91040b949d3
Apr 17 17:30:34.653399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:34.653358 2571 generic.go:358] "Generic (PLEG): container finished" podID="d6ed31df-b9fd-4360-9928-dee0dfd5ecfd" containerID="6c89517158f800b8af165991c3781d1f86e88f36ad706829e938f38a33200c02" exitCode=0
Apr 17 17:30:34.653864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:34.653445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerDied","Data":"6c89517158f800b8af165991c3781d1f86e88f36ad706829e938f38a33200c02"}
Apr 17 17:30:34.653864 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:34.653481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"aef49a56026fb435feef53c23ec3acb878ba8d598a68aa7a6b77f91040b949d3"}
Apr 17 17:30:35.619821 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.619782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt"
Apr 17 17:30:35.621969 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.621952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f82208d-ba82-434a-a209-d69847b4e54b-metrics-certs\") pod \"network-metrics-daemon-fczvt\" (UID: \"8f82208d-ba82-434a-a209-d69847b4e54b\") " pod="openshift-multus/network-metrics-daemon-fczvt"
Apr 17 17:30:35.660346 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"6916ec09c8a0768704ad1695a971f5ff5671050d0505d65f4575b153452b0f57"}
Apr 17 17:30:35.660733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"e9d458dee9e9452067fedc0780e74509d237f08adaf39eaefa3724d16c118552"}
Apr 17 17:30:35.660733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"7e5e5a92b9b5e079a053a6fcb27714d639ba752edf79abc0f3ff4f743b1a0a7f"}
Apr 17 17:30:35.660733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"18830cbb3430f5f65dbae50f5bdb0ed4d42096b94ea4681829d9e4999abb059b"}
Apr 17 17:30:35.660733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"276bbb13f17713f7a3c730a26ef207750fba81d4510afec76cd17df8854949c6"}
Apr 17 17:30:35.660733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.660387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d6ed31df-b9fd-4360-9928-dee0dfd5ecfd","Type":"ContainerStarted","Data":"641a37de0972b84fc9cc21664e79ab50f228cf064fe7001c3f616ee66852303f"}
Apr 17 17:30:35.689148 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.689104 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.689091375 podStartE2EDuration="2.689091375s" podCreationTimestamp="2026-04-17 17:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:30:35.687805581 +0000 UTC m=+252.361658916" watchObservedRunningTime="2026-04-17 17:30:35.689091375 +0000 UTC m=+252.362944711"
Apr 17 17:30:35.915156 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.915071 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\""
Apr 17 17:30:35.923369 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.923345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fczvt"
Apr 17 17:30:35.969077 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.969044 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"]
Apr 17 17:30:35.974763 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.974739 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:35.978374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.977966 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 17:30:35.978374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.978168 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 17:30:35.978374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.978212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 17:30:35.978374 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.978341 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 17:30:35.978656 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.978424 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 17:30:35.978656 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.978538 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gpbt9\""
Apr 17 17:30:35.987380 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.987159 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"]
Apr 17 17:30:35.988116 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:35.987818 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 17:30:36.024924 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.024888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.024924 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.024926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-federate-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.024961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgwh\" (UniqueName: \"kubernetes.io/projected/5744fb14-ebdf-446f-8539-4368921db8c4-kube-api-access-5fgwh\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.024984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-metrics-client-ca\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.025003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.025033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025325 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.025120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.025325 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.025178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-serving-certs-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.060431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.060333 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fczvt"]
Apr 17 17:30:36.062441 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:30:36.062410 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f82208d_ba82_434a_a209_d69847b4e54b.slice/crio-5f05c87ddb93855696713845ea6703393f7d3ab80035aaa4d8b37ade7fb95545 WatchSource:0}: Error finding container 5f05c87ddb93855696713845ea6703393f7d3ab80035aaa4d8b37ade7fb95545: Status 404 returned error can't find the container with id 5f05c87ddb93855696713845ea6703393f7d3ab80035aaa4d8b37ade7fb95545
Apr 17 17:30:36.126248 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgwh\" (UniqueName: \"kubernetes.io/projected/5744fb14-ebdf-446f-8539-4368921db8c4-kube-api-access-5fgwh\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126248 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-metrics-client-ca\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-serving-certs-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.126493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.126444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-federate-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.127243 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.127216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-serving-certs-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.127366 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.127282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-metrics-client-ca\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.127366 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.127349 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.129450 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.129416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.129552 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.129533 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-telemeter-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.129610 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.129552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-secret-telemeter-client\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.129610 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.129567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5744fb14-ebdf-446f-8539-4368921db8c4-federate-client-tls\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.135356 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.135330 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgwh\" (UniqueName: \"kubernetes.io/projected/5744fb14-ebdf-446f-8539-4368921db8c4-kube-api-access-5fgwh\") pod \"telemeter-client-9f65df4fc-5wqfp\" (UID: \"5744fb14-ebdf-446f-8539-4368921db8c4\") " pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.293363 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.292940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"
Apr 17 17:30:36.437981 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.437953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9f65df4fc-5wqfp"]
Apr 17 17:30:36.439865 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:30:36.439831 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5744fb14_ebdf_446f_8539_4368921db8c4.slice/crio-f20fe6997109e78a31556595cb17fd5375d3af3a4a7566f99ef7f9e658ef3a46 WatchSource:0}: Error finding container f20fe6997109e78a31556595cb17fd5375d3af3a4a7566f99ef7f9e658ef3a46: Status 404 returned error can't find the container with id f20fe6997109e78a31556595cb17fd5375d3af3a4a7566f99ef7f9e658ef3a46
Apr 17 17:30:36.668741 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.668631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fczvt" event={"ID":"8f82208d-ba82-434a-a209-d69847b4e54b","Type":"ContainerStarted","Data":"5f05c87ddb93855696713845ea6703393f7d3ab80035aaa4d8b37ade7fb95545"}
Apr 17 17:30:36.670568 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:36.670530 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp" event={"ID":"5744fb14-ebdf-446f-8539-4368921db8c4","Type":"ContainerStarted","Data":"f20fe6997109e78a31556595cb17fd5375d3af3a4a7566f99ef7f9e658ef3a46"}
Apr 17 17:30:37.674913 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:37.674879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fczvt" event={"ID":"8f82208d-ba82-434a-a209-d69847b4e54b","Type":"ContainerStarted","Data":"7acc2ac8fe4907092f783cb43df3593d6ece31c6a4a301d7f0186d8af3bda022"}
Apr 17 17:30:37.674913 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:37.674915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fczvt" event={"ID":"8f82208d-ba82-434a-a209-d69847b4e54b","Type":"ContainerStarted","Data":"3e377808a29d0834720c835214009ecff56a44cd655590e3cc94b74b2e68f534"}
Apr 17 17:30:37.693062 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:37.693018 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fczvt" podStartSLOduration=252.76008207 podStartE2EDuration="4m13.693005988s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:30:36.064426934 +0000 UTC m=+252.738280247" lastFinishedPulling="2026-04-17 17:30:36.997350849 +0000 UTC m=+253.671204165" observedRunningTime="2026-04-17 17:30:37.691205345 +0000 UTC m=+254.365058685" watchObservedRunningTime="2026-04-17 17:30:37.693005988 +0000 UTC m=+254.366859324"
Apr 17 17:30:38.681922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:38.681829 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp" event={"ID":"5744fb14-ebdf-446f-8539-4368921db8c4","Type":"ContainerStarted","Data":"dc63c2071de7c1dc1093d567149c60b85e8d28e5737852cbe271be64c66eabf8"}
Apr 17 17:30:38.681922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:38.681866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp" event={"ID":"5744fb14-ebdf-446f-8539-4368921db8c4","Type":"ContainerStarted","Data":"090d6d93c9cfe7d61125f70cc6da41afde93d856122e91a6753197ba004ef899"}
Apr 17 17:30:38.681922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:38.681875 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp" event={"ID":"5744fb14-ebdf-446f-8539-4368921db8c4","Type":"ContainerStarted","Data":"6adcffacdf3658f05139a6901ad4af9f202115c59d53df46c1caf1f3a2935504"}
Apr 17 17:30:38.706040 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:30:38.705982 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-9f65df4fc-5wqfp" podStartSLOduration=1.824871258 podStartE2EDuration="3.705967215s" podCreationTimestamp="2026-04-17 17:30:35 +0000 UTC" firstStartedPulling="2026-04-17 17:30:36.442139841 +0000 UTC m=+253.115993159" lastFinishedPulling="2026-04-17 17:30:38.323235788 +0000 UTC m=+254.997089116" observedRunningTime="2026-04-17 17:30:38.704680722 +0000 UTC m=+255.378534057" watchObservedRunningTime="2026-04-17 17:30:38.705967215 +0000 UTC m=+255.379820550"
Apr 17 17:31:23.808169 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:31:23.808148 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:32:12.195404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.195328 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4czqt"]
Apr 17 17:32:12.197557 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.197542 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.200005 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.199986 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 17:32:12.205872 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.205852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4czqt"]
Apr 17 17:32:12.325073 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.325047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-kubelet-config\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.325228 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.325081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-dbus\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.325228 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.325151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/203be660-71e2-4aef-9c05-185d66f59ebb-original-pull-secret\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.426267 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.426234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-kubelet-config\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.426360 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.426278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-dbus\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.426360 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.426317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/203be660-71e2-4aef-9c05-185d66f59ebb-original-pull-secret\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.426429 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.426354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-kubelet-config\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.426521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.426503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/203be660-71e2-4aef-9c05-185d66f59ebb-dbus\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.428601 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.428575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/203be660-71e2-4aef-9c05-185d66f59ebb-original-pull-secret\") pod \"global-pull-secret-syncer-4czqt\" (UID: \"203be660-71e2-4aef-9c05-185d66f59ebb\") " pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.506218 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.506200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4czqt"
Apr 17 17:32:12.620126 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.620104 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4czqt"]
Apr 17 17:32:12.622168 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:32:12.622141 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203be660_71e2_4aef_9c05_185d66f59ebb.slice/crio-d7c06d3d5155256daf6995fb56a75ba5629c5585d2e4f24d76512f3a59a0dc3c WatchSource:0}: Error finding container d7c06d3d5155256daf6995fb56a75ba5629c5585d2e4f24d76512f3a59a0dc3c: Status 404 returned error can't find the container with id d7c06d3d5155256daf6995fb56a75ba5629c5585d2e4f24d76512f3a59a0dc3c
Apr 17 17:32:12.623815 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.623799 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:32:12.934335 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:12.934255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4czqt" event={"ID":"203be660-71e2-4aef-9c05-185d66f59ebb","Type":"ContainerStarted","Data":"d7c06d3d5155256daf6995fb56a75ba5629c5585d2e4f24d76512f3a59a0dc3c"}
Apr 17 17:32:16.947646 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:16.947564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4czqt" event={"ID":"203be660-71e2-4aef-9c05-185d66f59ebb","Type":"ContainerStarted","Data":"51efb35e36bace5d5838d2593a4d54044ad5c8579697eb58a84f9b0805606b95"}
Apr 17 17:32:16.964518 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:32:16.964476 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4czqt" podStartSLOduration=1.039974392 podStartE2EDuration="4.964462253s" podCreationTimestamp="2026-04-17 17:32:12 +0000 UTC" firstStartedPulling="2026-04-17 17:32:12.62393407 +0000 UTC m=+349.297787383" lastFinishedPulling="2026-04-17 17:32:16.548421926 +0000 UTC m=+353.222275244" observedRunningTime="2026-04-17 17:32:16.963292511 +0000 UTC m=+353.637145849" watchObservedRunningTime="2026-04-17 17:32:16.964462253 +0000 UTC m=+353.638315590"
Apr 17 17:34:13.743267 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.743231 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-hpsp2"]
Apr 17 17:34:13.746388 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.746370 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.749050 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.749030 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 17:34:13.750155 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.750140 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vc8tk\""
Apr 17 17:34:13.750236 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.750198 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 17:34:13.753758 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.753738 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-hpsp2"]
Apr 17 17:34:13.814097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.814062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds99l\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-kube-api-access-ds99l\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.814097 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.814098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-bound-sa-token\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.914589 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.914562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds99l\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-kube-api-access-ds99l\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.914589 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.914592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-bound-sa-token\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.923747 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.923686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-bound-sa-token\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:13.923882 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:13.923833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds99l\" (UniqueName: \"kubernetes.io/projected/d8599535-f01b-4d82-a260-72279c2a92d7-kube-api-access-ds99l\") pod \"cert-manager-759f64656b-hpsp2\" (UID: \"d8599535-f01b-4d82-a260-72279c2a92d7\") " pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:14.055868 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:14.055795 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-hpsp2"
Apr 17 17:34:14.174225 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:14.174203 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-hpsp2"]
Apr 17 17:34:14.176230 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:34:14.176198 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8599535_f01b_4d82_a260_72279c2a92d7.slice/crio-0d948522b213cdf195e6f90cc62a3eed643ce70b06f56225f6f4ff84303d7e51 WatchSource:0}: Error finding container 0d948522b213cdf195e6f90cc62a3eed643ce70b06f56225f6f4ff84303d7e51: Status 404 returned error can't find the container with id 0d948522b213cdf195e6f90cc62a3eed643ce70b06f56225f6f4ff84303d7e51
Apr 17 17:34:14.275244 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:14.275205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-hpsp2" event={"ID":"d8599535-f01b-4d82-a260-72279c2a92d7","Type":"ContainerStarted","Data":"0d948522b213cdf195e6f90cc62a3eed643ce70b06f56225f6f4ff84303d7e51"}
Apr 17 17:34:18.287583 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:18.287550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-hpsp2" event={"ID":"d8599535-f01b-4d82-a260-72279c2a92d7","Type":"ContainerStarted","Data":"7994ec4d544788b7970a6c508ec846796702633fba40111503c0f3165f39fdb8"}
Apr 17 17:34:18.305947 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:18.305857 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-hpsp2" podStartSLOduration=1.444211256 podStartE2EDuration="5.305837315s" podCreationTimestamp="2026-04-17 17:34:13 +0000 UTC" firstStartedPulling="2026-04-17 17:34:14.177950008 +0000 UTC m=+470.851803329" lastFinishedPulling="2026-04-17 17:34:18.039576074 +0000 UTC m=+474.713429388" observedRunningTime="2026-04-17 17:34:18.305154423 +0000 UTC m=+474.979007762" watchObservedRunningTime="2026-04-17 17:34:18.305837315 +0000 UTC m=+474.979690650"
Apr 17 17:34:22.413453 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.413417 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5"]
Apr 17 17:34:22.416899 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.416881 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5"
Apr 17 17:34:22.420319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.420287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 17:34:22.420319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.420318 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 17:34:22.420491 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.420297 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:34:22.421975 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.421959 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9985d\""
Apr 17 17:34:22.422101 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.422078 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 17:34:22.422177 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.422079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 17:34:22.431432 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.431405 2571 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5"] Apr 17 17:34:22.589527 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.589495 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-metrics-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.589527 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.589532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-manager-config\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.589750 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.589557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.589750 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.589628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fwc\" (UniqueName: \"kubernetes.io/projected/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-kube-api-access-n6fwc\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.690805 ip-10-0-138-42 kubenswrapper[2571]: I0417 
17:34:22.690729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-metrics-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.690805 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.690773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-manager-config\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.690805 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.690805 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.691010 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.690909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fwc\" (UniqueName: \"kubernetes.io/projected/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-kube-api-access-n6fwc\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.691498 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.691470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-manager-config\") pod 
\"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.693142 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.693123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-metrics-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.693235 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.693218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-cert\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.701001 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.700973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fwc\" (UniqueName: \"kubernetes.io/projected/6318e701-9cfb-4ba7-a8ea-74909ddba5dd-kube-api-access-n6fwc\") pod \"lws-controller-manager-6f45766749-wnjd5\" (UID: \"6318e701-9cfb-4ba7-a8ea-74909ddba5dd\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.726021 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.725983 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:22.848352 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:22.848295 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5"] Apr 17 17:34:22.850877 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:34:22.850844 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6318e701_9cfb_4ba7_a8ea_74909ddba5dd.slice/crio-f5b661208743264ea7ef8970e49b33492fa51b448d10d93d33b33f7ad0eefc99 WatchSource:0}: Error finding container f5b661208743264ea7ef8970e49b33492fa51b448d10d93d33b33f7ad0eefc99: Status 404 returned error can't find the container with id f5b661208743264ea7ef8970e49b33492fa51b448d10d93d33b33f7ad0eefc99 Apr 17 17:34:23.304434 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:23.304399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" event={"ID":"6318e701-9cfb-4ba7-a8ea-74909ddba5dd","Type":"ContainerStarted","Data":"f5b661208743264ea7ef8970e49b33492fa51b448d10d93d33b33f7ad0eefc99"} Apr 17 17:34:27.316892 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.316852 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" event={"ID":"6318e701-9cfb-4ba7-a8ea-74909ddba5dd","Type":"ContainerStarted","Data":"dcef4e6afca4d8dbf4d54dd6201692d97c153e2317d37281866a21eab07e3964"} Apr 17 17:34:27.317276 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.316913 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:27.343918 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.343873 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" podStartSLOduration=1.641370684 podStartE2EDuration="5.343858763s" podCreationTimestamp="2026-04-17 17:34:22 +0000 UTC" firstStartedPulling="2026-04-17 17:34:22.852657656 +0000 UTC m=+479.526510970" lastFinishedPulling="2026-04-17 17:34:26.555145735 +0000 UTC m=+483.228999049" observedRunningTime="2026-04-17 17:34:27.342139612 +0000 UTC m=+484.015992949" watchObservedRunningTime="2026-04-17 17:34:27.343858763 +0000 UTC m=+484.017712099" Apr 17 17:34:27.930111 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.930073 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj"] Apr 17 17:34:27.933344 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.933323 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:27.936494 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.936466 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 17:34:27.936600 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.936479 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pdn5v\"" Apr 17 17:34:27.936735 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.936720 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 17:34:27.936935 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.936912 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 17:34:27.936992 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.936957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 17:34:27.944153 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:27.944131 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj"] Apr 17 17:34:28.036397 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.036348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6z8\" (UniqueName: \"kubernetes.io/projected/72f44ff3-707a-444a-bfc9-5dd333f3568a-kube-api-access-9l6z8\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.036574 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.036418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.036574 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.036475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.137891 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.137844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.138060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.137934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6z8\" (UniqueName: \"kubernetes.io/projected/72f44ff3-707a-444a-bfc9-5dd333f3568a-kube-api-access-9l6z8\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.138060 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.137980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.140412 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.140382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.140560 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.140535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72f44ff3-707a-444a-bfc9-5dd333f3568a-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: 
\"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.165330 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.165304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6z8\" (UniqueName: \"kubernetes.io/projected/72f44ff3-707a-444a-bfc9-5dd333f3568a-kube-api-access-9l6z8\") pod \"opendatahub-operator-controller-manager-77fb85d776-wqcvj\" (UID: \"72f44ff3-707a-444a-bfc9-5dd333f3568a\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.244040 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.244009 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:28.367683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:28.367659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj"] Apr 17 17:34:28.369458 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:34:28.369433 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f44ff3_707a_444a_bfc9_5dd333f3568a.slice/crio-960007c7a78b90bf92330ff3ab5d7f8bcbd94d552ea54fd4de0c1d3304408fdd WatchSource:0}: Error finding container 960007c7a78b90bf92330ff3ab5d7f8bcbd94d552ea54fd4de0c1d3304408fdd: Status 404 returned error can't find the container with id 960007c7a78b90bf92330ff3ab5d7f8bcbd94d552ea54fd4de0c1d3304408fdd Apr 17 17:34:29.325235 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:29.325193 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" event={"ID":"72f44ff3-707a-444a-bfc9-5dd333f3568a","Type":"ContainerStarted","Data":"960007c7a78b90bf92330ff3ab5d7f8bcbd94d552ea54fd4de0c1d3304408fdd"} Apr 17 17:34:31.335222 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:34:31.335180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" event={"ID":"72f44ff3-707a-444a-bfc9-5dd333f3568a","Type":"ContainerStarted","Data":"3da8b0deb415c3da7cf8daac22d1bd8707dd7b112841e214b3b1273070fbec09"} Apr 17 17:34:31.335685 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:31.335304 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:34:31.367409 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:31.367358 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" podStartSLOduration=1.993190011 podStartE2EDuration="4.367344766s" podCreationTimestamp="2026-04-17 17:34:27 +0000 UTC" firstStartedPulling="2026-04-17 17:34:28.371102562 +0000 UTC m=+485.044955877" lastFinishedPulling="2026-04-17 17:34:30.745257314 +0000 UTC m=+487.419110632" observedRunningTime="2026-04-17 17:34:31.364277883 +0000 UTC m=+488.038131219" watchObservedRunningTime="2026-04-17 17:34:31.367344766 +0000 UTC m=+488.041198101" Apr 17 17:34:38.323603 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:38.323571 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6f45766749-wnjd5" Apr 17 17:34:42.339863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:34:42.339833 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-wqcvj" Apr 17 17:35:09.251654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.251578 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"] Apr 17 17:35:09.255038 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.255017 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.257830 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.257807 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:35:09.257943 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.257814 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:35:09.258013 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.257990 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-8ttv7\"" Apr 17 17:35:09.258062 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.258006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 17:35:09.266259 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.266241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"] Apr 17 17:35:09.384230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d79f2435-2abc-482c-a314-824069d12177-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384229 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" 
(UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d79f2435-2abc-482c-a314-824069d12177-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvg7\" (UniqueName: \"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-kube-api-access-6nvg7\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.384783 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.384521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485650 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d79f2435-2abc-482c-a314-824069d12177-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvg7\" (UniqueName: 
\"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-kube-api-access-6nvg7\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d79f2435-2abc-482c-a314-824069d12177-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.485842 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" Apr 17 17:35:09.486192 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.485860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.486192 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.486094 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.486192 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.486156 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.486361 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.486289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.486546 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.486528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.486627 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.486532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d79f2435-2abc-482c-a314-824069d12177-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.488049 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.488025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d79f2435-2abc-482c-a314-824069d12177-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.488164 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.488145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d79f2435-2abc-482c-a314-824069d12177-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.493787 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.493767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.494861 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.494837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvg7\" (UniqueName: \"kubernetes.io/projected/d79f2435-2abc-482c-a314-824069d12177-kube-api-access-6nvg7\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds\" (UID: \"d79f2435-2abc-482c-a314-824069d12177\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.565323 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.565265 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:09.685858 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:09.685836 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"]
Apr 17 17:35:09.688608 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:35:09.688581 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd79f2435_2abc_482c_a314_824069d12177.slice/crio-5d5eeb5fd33e02b6e7b07f83b8b517481304d19f65a7d9334b1ee5300c07ed0b WatchSource:0}: Error finding container 5d5eeb5fd33e02b6e7b07f83b8b517481304d19f65a7d9334b1ee5300c07ed0b: Status 404 returned error can't find the container with id 5d5eeb5fd33e02b6e7b07f83b8b517481304d19f65a7d9334b1ee5300c07ed0b
Apr 17 17:35:10.460267 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:10.460231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" event={"ID":"d79f2435-2abc-482c-a314-824069d12177","Type":"ContainerStarted","Data":"5d5eeb5fd33e02b6e7b07f83b8b517481304d19f65a7d9334b1ee5300c07ed0b"}
Apr 17 17:35:12.625870 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:12.625825 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:35:12.626138 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:12.625911 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:35:12.626138 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:12.625938 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:35:13.471473 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:13.471442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" event={"ID":"d79f2435-2abc-482c-a314-824069d12177","Type":"ContainerStarted","Data":"716d7665a07b615743ef8338072e3200dbca3da7e972cc710e5346c67715bb77"}
Apr 17 17:35:13.498524 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:13.498473 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds" podStartSLOduration=1.563685822 podStartE2EDuration="4.4984578s" podCreationTimestamp="2026-04-17 17:35:09 +0000 UTC" firstStartedPulling="2026-04-17 17:35:09.690761816 +0000 UTC m=+526.364615130" lastFinishedPulling="2026-04-17 17:35:12.625533791 +0000 UTC m=+529.299387108" observedRunningTime="2026-04-17 17:35:13.496946141 +0000 UTC m=+530.170799479" watchObservedRunningTime="2026-04-17 17:35:13.4984578 +0000 UTC m=+530.172311134"
Apr 17 17:35:13.566395 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:13.566360 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:13.570954 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:13.570920 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:14.474962 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:14.474921 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:14.475855 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:14.475835 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds"
Apr 17 17:35:46.036440 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.036396 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:46.046539 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.046509 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:46.048593 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.048564 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:46.049614 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.049587 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-9lqs2\""
Apr 17 17:35:46.050854 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.050829 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 17:35:46.051007 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.050829 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 17:35:46.067821 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.067784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmkv\" (UniqueName: \"kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv\") pod \"kuadrant-operator-catalog-ccw5k\" (UID: \"1c722ccf-7087-403a-9b9d-eedc187ea1cd\") " pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:46.168296 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.168263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snmkv\" (UniqueName: \"kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv\") pod \"kuadrant-operator-catalog-ccw5k\" (UID: \"1c722ccf-7087-403a-9b9d-eedc187ea1cd\") " pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:46.177675 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.177651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmkv\" (UniqueName: \"kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv\") pod \"kuadrant-operator-catalog-ccw5k\" (UID: \"1c722ccf-7087-403a-9b9d-eedc187ea1cd\") " pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:46.358623 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.358523 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:46.381101 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.381062 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:46.485312 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.485286 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:46.486961 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:35:46.486935 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c722ccf_7087_403a_9b9d_eedc187ea1cd.slice/crio-f3c2f97e78dfb9b66401a94037489259917d21e465e620ea714c72484dfbcd42 WatchSource:0}: Error finding container f3c2f97e78dfb9b66401a94037489259917d21e465e620ea714c72484dfbcd42: Status 404 returned error can't find the container with id f3c2f97e78dfb9b66401a94037489259917d21e465e620ea714c72484dfbcd42
Apr 17 17:35:46.582601 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.582560 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" event={"ID":"1c722ccf-7087-403a-9b9d-eedc187ea1cd","Type":"ContainerStarted","Data":"f3c2f97e78dfb9b66401a94037489259917d21e465e620ea714c72484dfbcd42"}
Apr 17 17:35:46.604644 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.604609 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2z7gq"]
Apr 17 17:35:46.610623 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.610567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:46.626197 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.626168 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2z7gq"]
Apr 17 17:35:46.671281 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.671242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q676x\" (UniqueName: \"kubernetes.io/projected/657343ce-d367-433d-95fc-af04f60b3050-kube-api-access-q676x\") pod \"kuadrant-operator-catalog-2z7gq\" (UID: \"657343ce-d367-433d-95fc-af04f60b3050\") " pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:46.772435 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.772399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q676x\" (UniqueName: \"kubernetes.io/projected/657343ce-d367-433d-95fc-af04f60b3050-kube-api-access-q676x\") pod \"kuadrant-operator-catalog-2z7gq\" (UID: \"657343ce-d367-433d-95fc-af04f60b3050\") " pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:46.780810 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.780783 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q676x\" (UniqueName: \"kubernetes.io/projected/657343ce-d367-433d-95fc-af04f60b3050-kube-api-access-q676x\") pod \"kuadrant-operator-catalog-2z7gq\" (UID: \"657343ce-d367-433d-95fc-af04f60b3050\") " pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:46.920021 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:46.919924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:47.049938 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:47.049909 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2z7gq"]
Apr 17 17:35:47.051808 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:35:47.051772 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657343ce_d367_433d_95fc_af04f60b3050.slice/crio-9aab7dc0b77a747cd09bf6cafa3e3f7c85b2796b5afbf69a2f87c951b1197907 WatchSource:0}: Error finding container 9aab7dc0b77a747cd09bf6cafa3e3f7c85b2796b5afbf69a2f87c951b1197907: Status 404 returned error can't find the container with id 9aab7dc0b77a747cd09bf6cafa3e3f7c85b2796b5afbf69a2f87c951b1197907
Apr 17 17:35:47.588819 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:47.588774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq" event={"ID":"657343ce-d367-433d-95fc-af04f60b3050","Type":"ContainerStarted","Data":"9aab7dc0b77a747cd09bf6cafa3e3f7c85b2796b5afbf69a2f87c951b1197907"}
Apr 17 17:35:49.596113 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.596075 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" event={"ID":"1c722ccf-7087-403a-9b9d-eedc187ea1cd","Type":"ContainerStarted","Data":"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"}
Apr 17 17:35:49.596548 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.596156 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" podUID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" containerName="registry-server" containerID="cri-o://fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd" gracePeriod=2
Apr 17 17:35:49.597392 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.597369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq" event={"ID":"657343ce-d367-433d-95fc-af04f60b3050","Type":"ContainerStarted","Data":"c41228b6174ecba40fbc8bc37b791c1438618814c91ee315b87ec4291050f342"}
Apr 17 17:35:49.613885 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.613836 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" podStartSLOduration=1.433483618 podStartE2EDuration="3.6138219s" podCreationTimestamp="2026-04-17 17:35:46 +0000 UTC" firstStartedPulling="2026-04-17 17:35:46.488237822 +0000 UTC m=+563.162091136" lastFinishedPulling="2026-04-17 17:35:48.668576105 +0000 UTC m=+565.342429418" observedRunningTime="2026-04-17 17:35:49.612331185 +0000 UTC m=+566.286184529" watchObservedRunningTime="2026-04-17 17:35:49.6138219 +0000 UTC m=+566.287675236"
Apr 17 17:35:49.832302 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.832278 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:49.850259 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.850155 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq" podStartSLOduration=2.19383488 podStartE2EDuration="3.850133243s" podCreationTimestamp="2026-04-17 17:35:46 +0000 UTC" firstStartedPulling="2026-04-17 17:35:47.053457235 +0000 UTC m=+563.727310548" lastFinishedPulling="2026-04-17 17:35:48.709755595 +0000 UTC m=+565.383608911" observedRunningTime="2026-04-17 17:35:49.626996694 +0000 UTC m=+566.300850030" watchObservedRunningTime="2026-04-17 17:35:49.850133243 +0000 UTC m=+566.523986579"
Apr 17 17:35:49.898185 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.898148 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmkv\" (UniqueName: \"kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv\") pod \"1c722ccf-7087-403a-9b9d-eedc187ea1cd\" (UID: \"1c722ccf-7087-403a-9b9d-eedc187ea1cd\") "
Apr 17 17:35:49.900476 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.900446 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv" (OuterVolumeSpecName: "kube-api-access-snmkv") pod "1c722ccf-7087-403a-9b9d-eedc187ea1cd" (UID: "1c722ccf-7087-403a-9b9d-eedc187ea1cd"). InnerVolumeSpecName "kube-api-access-snmkv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:35:49.998731 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:49.998664 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snmkv\" (UniqueName: \"kubernetes.io/projected/1c722ccf-7087-403a-9b9d-eedc187ea1cd-kube-api-access-snmkv\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\""
Apr 17 17:35:50.602091 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.602053 2571 generic.go:358] "Generic (PLEG): container finished" podID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" containerID="fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd" exitCode=0
Apr 17 17:35:50.602521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.602119 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k"
Apr 17 17:35:50.602521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.602145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" event={"ID":"1c722ccf-7087-403a-9b9d-eedc187ea1cd","Type":"ContainerDied","Data":"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"}
Apr 17 17:35:50.602521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.602179 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ccw5k" event={"ID":"1c722ccf-7087-403a-9b9d-eedc187ea1cd","Type":"ContainerDied","Data":"f3c2f97e78dfb9b66401a94037489259917d21e465e620ea714c72484dfbcd42"}
Apr 17 17:35:50.602521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.602196 2571 scope.go:117] "RemoveContainer" containerID="fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"
Apr 17 17:35:50.610489 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.610458 2571 scope.go:117] "RemoveContainer" containerID="fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"
Apr 17 17:35:50.610780 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:35:50.610761 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd\": container with ID starting with fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd not found: ID does not exist" containerID="fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"
Apr 17 17:35:50.610860 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.610789 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd"} err="failed to get container status \"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd\": rpc error: code = NotFound desc = could not find container \"fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd\": container with ID starting with fa43965c72caeebf83b7cc35ba8e2fcde4f57fb4072a83a7ec0db4d6c3d7d7cd not found: ID does not exist"
Apr 17 17:35:50.619493 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.619467 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:50.620847 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:50.620821 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ccw5k"]
Apr 17 17:35:51.917107 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:51.917074 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" path="/var/lib/kubelet/pods/1c722ccf-7087-403a-9b9d-eedc187ea1cd/volumes"
Apr 17 17:35:56.920691 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:56.920656 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:56.920691 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:56.920715 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:56.942726 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:56.942685 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:35:57.646666 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:35:57.646639 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-2z7gq"
Apr 17 17:36:17.516934 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.516899 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"]
Apr 17 17:36:17.517391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.517214 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" containerName="registry-server"
Apr 17 17:36:17.517391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.517224 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" containerName="registry-server"
Apr 17 17:36:17.517391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.517301 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c722ccf-7087-403a-9b9d-eedc187ea1cd" containerName="registry-server"
Apr 17 17:36:17.519863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.519847 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:17.522660 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.522637 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-rmmb8\""
Apr 17 17:36:17.534099 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.534075 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"]
Apr 17 17:36:17.629680 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.629645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5c4\" (UniqueName: \"kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4\") pod \"limitador-operator-controller-manager-85c4996f8c-vkjxx\" (UID: \"8287280d-10d7-4e94-a450-2fda255ea9ee\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:17.730823 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.730781 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5c4\" (UniqueName: \"kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4\") pod \"limitador-operator-controller-manager-85c4996f8c-vkjxx\" (UID: \"8287280d-10d7-4e94-a450-2fda255ea9ee\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:17.745661 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.745630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5c4\" (UniqueName: \"kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4\") pod \"limitador-operator-controller-manager-85c4996f8c-vkjxx\" (UID: \"8287280d-10d7-4e94-a450-2fda255ea9ee\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:17.829071 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.828987 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:17.956486 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:17.956452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"]
Apr 17 17:36:18.697942 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:18.697900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" event={"ID":"8287280d-10d7-4e94-a450-2fda255ea9ee","Type":"ContainerStarted","Data":"30de1b83094ead8345c725e31a0a62f76f8f3bc54232a2df6136b7b7d8d50269"}
Apr 17 17:36:20.708177 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:20.708137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" event={"ID":"8287280d-10d7-4e94-a450-2fda255ea9ee","Type":"ContainerStarted","Data":"46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678"}
Apr 17 17:36:20.708540 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:20.708283 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:20.727407 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:20.727354 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" podStartSLOduration=1.946068562 podStartE2EDuration="3.727338867s" podCreationTimestamp="2026-04-17 17:36:17 +0000 UTC" firstStartedPulling="2026-04-17 17:36:17.96058517 +0000 UTC m=+594.634438498" lastFinishedPulling="2026-04-17 17:36:19.741855484 +0000 UTC m=+596.415708803" observedRunningTime="2026-04-17 17:36:20.725122796 +0000 UTC m=+597.398976145" watchObservedRunningTime="2026-04-17 17:36:20.727338867 +0000 UTC m=+597.401192259"
Apr 17 17:36:25.016124 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.016091 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"]
Apr 17 17:36:25.023821 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.023799 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:25.027409 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.027383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 17 17:36:25.027519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.027413 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-qqd8p\""
Apr 17 17:36:25.035747 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.035725 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"]
Apr 17 17:36:25.097322 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.097285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxg2\" (UniqueName: \"kubernetes.io/projected/05903816-57b3-4c2f-a474-39431e9e4315-kube-api-access-bxxg2\") pod \"dns-operator-controller-manager-648d5c98bc-7rbxr\" (UID: \"05903816-57b3-4c2f-a474-39431e9e4315\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:25.198688 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.198652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxg2\" (UniqueName: \"kubernetes.io/projected/05903816-57b3-4c2f-a474-39431e9e4315-kube-api-access-bxxg2\") pod \"dns-operator-controller-manager-648d5c98bc-7rbxr\" (UID: \"05903816-57b3-4c2f-a474-39431e9e4315\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:25.209391 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.209360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxg2\" (UniqueName: \"kubernetes.io/projected/05903816-57b3-4c2f-a474-39431e9e4315-kube-api-access-bxxg2\") pod \"dns-operator-controller-manager-648d5c98bc-7rbxr\" (UID: \"05903816-57b3-4c2f-a474-39431e9e4315\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:25.334349 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.334247 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:25.461648 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.461607 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"]
Apr 17 17:36:25.464561 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:36:25.464534 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05903816_57b3_4c2f_a474_39431e9e4315.slice/crio-8b864251a5940001eb638ae8c871745ecb125362c63cc4e8b3cc311c8b882b40 WatchSource:0}: Error finding container 8b864251a5940001eb638ae8c871745ecb125362c63cc4e8b3cc311c8b882b40: Status 404 returned error can't find the container with id 8b864251a5940001eb638ae8c871745ecb125362c63cc4e8b3cc311c8b882b40
Apr 17 17:36:25.725089 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:25.725055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr" event={"ID":"05903816-57b3-4c2f-a474-39431e9e4315","Type":"ContainerStarted","Data":"8b864251a5940001eb638ae8c871745ecb125362c63cc4e8b3cc311c8b882b40"}
Apr 17 17:36:27.732942 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:27.732902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr" event={"ID":"05903816-57b3-4c2f-a474-39431e9e4315","Type":"ContainerStarted","Data":"83993350e3d51fdcbcd2597d6265be1278953448ac06fc57908b2baa6277be53"}
Apr 17 17:36:27.733386 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:27.733047 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr"
Apr 17 17:36:27.756372 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:27.756311 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr" podStartSLOduration=1.602377422 podStartE2EDuration="3.756293043s" podCreationTimestamp="2026-04-17 17:36:24 +0000 UTC" firstStartedPulling="2026-04-17 17:36:25.466851299 +0000 UTC m=+602.140704612" lastFinishedPulling="2026-04-17 17:36:27.620766915 +0000 UTC m=+604.294620233" observedRunningTime="2026-04-17 17:36:27.755019818 +0000 UTC m=+604.428873154" watchObservedRunningTime="2026-04-17 17:36:27.756293043 +0000 UTC m=+604.430146382"
Apr 17 17:36:29.412387 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.412347 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"]
Apr 17 17:36:29.421232 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.421186 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.424283 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.424258 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-tx4w5\""
Apr 17 17:36:29.431204 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.431174 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"]
Apr 17 17:36:29.537122 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.537069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q87fn\" (UniqueName: \"kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.537319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.537178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.638588 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.638547 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.638816 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.638748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q87fn\" (UniqueName: \"kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.639045 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.639014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.647544 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.647516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q87fn\" (UniqueName: \"kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.732498 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.732459 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"
Apr 17 17:36:29.862180 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:29.862156 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"]
Apr 17 17:36:29.864896 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:36:29.864860 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6018dabf_5e25_458c_bebe_95770805b360.slice/crio-7c305140cfa946140ae43e5c207fa5e44f421ab7f14d991a6661c58afd08ffd3 WatchSource:0}: Error finding container 7c305140cfa946140ae43e5c207fa5e44f421ab7f14d991a6661c58afd08ffd3: Status 404 returned error can't find the container with id 7c305140cfa946140ae43e5c207fa5e44f421ab7f14d991a6661c58afd08ffd3
Apr 17 17:36:30.745064 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:30.745028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" event={"ID":"6018dabf-5e25-458c-bebe-95770805b360","Type":"ContainerStarted","Data":"7c305140cfa946140ae43e5c207fa5e44f421ab7f14d991a6661c58afd08ffd3"}
Apr 17 17:36:31.713745 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:31.713687 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"
Apr 17 17:36:35.761998 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:35.761960 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" event={"ID":"6018dabf-5e25-458c-bebe-95770805b360","Type":"ContainerStarted","Data":"6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281"}
Apr 17 17:36:35.762431 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:35.762126 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" Apr 17 17:36:35.781112 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:35.781056 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" podStartSLOduration=1.8489702879999999 podStartE2EDuration="6.781040584s" podCreationTimestamp="2026-04-17 17:36:29 +0000 UTC" firstStartedPulling="2026-04-17 17:36:29.867225623 +0000 UTC m=+606.541078938" lastFinishedPulling="2026-04-17 17:36:34.79929592 +0000 UTC m=+611.473149234" observedRunningTime="2026-04-17 17:36:35.779623685 +0000 UTC m=+612.453477032" watchObservedRunningTime="2026-04-17 17:36:35.781040584 +0000 UTC m=+612.454893922" Apr 17 17:36:38.738794 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:38.738763 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7rbxr" Apr 17 17:36:46.768075 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:46.768040 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" Apr 17 17:36:47.764405 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.764365 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"] Apr 17 17:36:47.764677 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.764652 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" podUID="6018dabf-5e25-458c-bebe-95770805b360" containerName="manager" containerID="cri-o://6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281" gracePeriod=2 Apr 17 17:36:47.767636 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.767582 2571 status_manager.go:895] "Failed to get status for pod" 
podUID="6018dabf-5e25-458c-bebe-95770805b360" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.768816 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.768785 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt"] Apr 17 17:36:47.783237 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.783214 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"] Apr 17 17:36:47.783511 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.783488 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" containerName="manager" containerID="cri-o://46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678" gracePeriod=2 Apr 17 17:36:47.785907 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.785876 2571 status_manager.go:895] "Failed to get status for pod" podUID="6018dabf-5e25-458c-bebe-95770805b360" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.801416 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.801393 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx"] Apr 17 17:36:47.808297 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808270 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"] Apr 17 17:36:47.808817 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808800 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6018dabf-5e25-458c-bebe-95770805b360" containerName="manager" Apr 17 17:36:47.808886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808819 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6018dabf-5e25-458c-bebe-95770805b360" containerName="manager" Apr 17 17:36:47.808886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808839 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" containerName="manager" Apr 17 17:36:47.808886 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808850 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" containerName="manager" Apr 17 17:36:47.808979 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808929 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" containerName="manager" Apr 17 17:36:47.808979 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.808943 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6018dabf-5e25-458c-bebe-95770805b360" containerName="manager" Apr 17 17:36:47.811963 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.811947 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:47.814399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.814369 2571 status_manager.go:895] "Failed to get status for pod" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" err="pods \"limitador-operator-controller-manager-85c4996f8c-vkjxx\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.816478 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.816454 2571 status_manager.go:895] "Failed to get status for pod" podUID="6018dabf-5e25-458c-bebe-95770805b360" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.820804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.820780 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr"] Apr 17 17:36:47.823772 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.823755 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:47.826402 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.826382 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"] Apr 17 17:36:47.840112 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.840091 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr"] Apr 17 17:36:47.848564 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.848536 2571 status_manager.go:895] "Failed to get status for pod" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" err="pods \"limitador-operator-controller-manager-85c4996f8c-vkjxx\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.872687 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.872659 2571 status_manager.go:895] "Failed to get status for pod" podUID="6018dabf-5e25-458c-bebe-95770805b360" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vcdzt\" is forbidden: User \"system:node:ip-10-0-138-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-42.ec2.internal' and this object" Apr 17 17:36:47.897577 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.897553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnw8\" (UniqueName: \"kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:47.897675 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.897591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jjnh\" (UniqueName: \"kubernetes.io/projected/4db4bee4-0928-4b25-bae2-4ceac22bdf41-kube-api-access-7jjnh\") pod \"limitador-operator-controller-manager-85c4996f8c-z2xtr\" (UID: \"4db4bee4-0928-4b25-bae2-4ceac22bdf41\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:47.897675 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:47.897610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.000401 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.000372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnw8\" (UniqueName: \"kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.000468 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.000423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jjnh\" (UniqueName: \"kubernetes.io/projected/4db4bee4-0928-4b25-bae2-4ceac22bdf41-kube-api-access-7jjnh\") pod 
\"limitador-operator-controller-manager-85c4996f8c-z2xtr\" (UID: \"4db4bee4-0928-4b25-bae2-4ceac22bdf41\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:48.000468 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.000446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.000836 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.000820 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.004683 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.004663 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" Apr 17 17:36:48.010020 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.010001 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnw8\" (UniqueName: \"kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d8vb6\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.010173 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.010149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jjnh\" (UniqueName: \"kubernetes.io/projected/4db4bee4-0928-4b25-bae2-4ceac22bdf41-kube-api-access-7jjnh\") pod \"limitador-operator-controller-manager-85c4996f8c-z2xtr\" (UID: \"4db4bee4-0928-4b25-bae2-4ceac22bdf41\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:48.017667 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.017624 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" Apr 17 17:36:48.101052 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.101026 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume\") pod \"6018dabf-5e25-458c-bebe-95770805b360\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " Apr 17 17:36:48.101163 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.101102 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g5c4\" (UniqueName: \"kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4\") pod \"8287280d-10d7-4e94-a450-2fda255ea9ee\" (UID: \"8287280d-10d7-4e94-a450-2fda255ea9ee\") " Apr 17 17:36:48.101163 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.101122 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q87fn\" (UniqueName: \"kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn\") pod \"6018dabf-5e25-458c-bebe-95770805b360\" (UID: \"6018dabf-5e25-458c-bebe-95770805b360\") " Apr 17 17:36:48.101470 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.101449 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6018dabf-5e25-458c-bebe-95770805b360" (UID: "6018dabf-5e25-458c-bebe-95770805b360"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:48.103008 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.102989 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4" (OuterVolumeSpecName: "kube-api-access-4g5c4") pod "8287280d-10d7-4e94-a450-2fda255ea9ee" (UID: "8287280d-10d7-4e94-a450-2fda255ea9ee"). InnerVolumeSpecName "kube-api-access-4g5c4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:48.103083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.103064 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn" (OuterVolumeSpecName: "kube-api-access-q87fn") pod "6018dabf-5e25-458c-bebe-95770805b360" (UID: "6018dabf-5e25-458c-bebe-95770805b360"). InnerVolumeSpecName "kube-api-access-q87fn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:48.201895 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.201874 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g5c4\" (UniqueName: \"kubernetes.io/projected/8287280d-10d7-4e94-a450-2fda255ea9ee-kube-api-access-4g5c4\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:36:48.201895 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.201893 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q87fn\" (UniqueName: \"kubernetes.io/projected/6018dabf-5e25-458c-bebe-95770805b360-kube-api-access-q87fn\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:36:48.202015 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.201902 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6018dabf-5e25-458c-bebe-95770805b360-extensions-socket-volume\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 
17 17:36:48.206811 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.206792 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.212533 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.212519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:48.333276 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.333092 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"] Apr 17 17:36:48.336081 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:36:48.336052 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875cebba_fbc2_4622_a3d9_91348867ab47.slice/crio-2d4328ddc20fcebd693b6804b3be8b0024428cc1c3b9dd5e1902b2a4d196565c WatchSource:0}: Error finding container 2d4328ddc20fcebd693b6804b3be8b0024428cc1c3b9dd5e1902b2a4d196565c: Status 404 returned error can't find the container with id 2d4328ddc20fcebd693b6804b3be8b0024428cc1c3b9dd5e1902b2a4d196565c Apr 17 17:36:48.353122 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.353090 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr"] Apr 17 17:36:48.355112 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:36:48.355086 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db4bee4_0928_4b25_bae2_4ceac22bdf41.slice/crio-600ecce289436af568630248d7ca3aacce757429812ff75df52a1a32639a5a18 WatchSource:0}: Error finding container 600ecce289436af568630248d7ca3aacce757429812ff75df52a1a32639a5a18: Status 404 returned error can't find the container with id 
600ecce289436af568630248d7ca3aacce757429812ff75df52a1a32639a5a18 Apr 17 17:36:48.807050 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.807017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" event={"ID":"4db4bee4-0928-4b25-bae2-4ceac22bdf41","Type":"ContainerStarted","Data":"5016ec16342f9dd0733131706def54570e28cb227a9e9d161751be48ab38c85e"} Apr 17 17:36:48.807442 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.807057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" event={"ID":"4db4bee4-0928-4b25-bae2-4ceac22bdf41","Type":"ContainerStarted","Data":"600ecce289436af568630248d7ca3aacce757429812ff75df52a1a32639a5a18"} Apr 17 17:36:48.807442 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.807114 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" Apr 17 17:36:48.808146 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.808120 2571 generic.go:358] "Generic (PLEG): container finished" podID="8287280d-10d7-4e94-a450-2fda255ea9ee" containerID="46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678" exitCode=0 Apr 17 17:36:48.808256 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.808160 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vkjxx" Apr 17 17:36:48.808256 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.808197 2571 scope.go:117] "RemoveContainer" containerID="46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678" Apr 17 17:36:48.809557 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.809536 2571 generic.go:358] "Generic (PLEG): container finished" podID="6018dabf-5e25-458c-bebe-95770805b360" containerID="6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281" exitCode=0 Apr 17 17:36:48.809656 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.809607 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vcdzt" Apr 17 17:36:48.811160 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.811141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" event={"ID":"875cebba-fbc2-4622-a3d9-91348867ab47","Type":"ContainerStarted","Data":"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"} Apr 17 17:36:48.811264 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.811168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" event={"ID":"875cebba-fbc2-4622-a3d9-91348867ab47","Type":"ContainerStarted","Data":"2d4328ddc20fcebd693b6804b3be8b0024428cc1c3b9dd5e1902b2a4d196565c"} Apr 17 17:36:48.811331 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.811281 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" Apr 17 17:36:48.817512 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.817497 2571 scope.go:117] "RemoveContainer" containerID="46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678" Apr 17 17:36:48.817797 
ip-10-0-138-42 kubenswrapper[2571]: E0417 17:36:48.817780 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678\": container with ID starting with 46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678 not found: ID does not exist" containerID="46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678" Apr 17 17:36:48.817852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.817805 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678"} err="failed to get container status \"46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678\": rpc error: code = NotFound desc = could not find container \"46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678\": container with ID starting with 46bb97d8261af7dfa433a0d9c7f739d17598a19fec21f1cfa5dc2ab65bd80678 not found: ID does not exist" Apr 17 17:36:48.817852 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.817821 2571 scope.go:117] "RemoveContainer" containerID="6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281" Apr 17 17:36:48.825178 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.825161 2571 scope.go:117] "RemoveContainer" containerID="6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281" Apr 17 17:36:48.825405 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:36:48.825386 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281\": container with ID starting with 6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281 not found: ID does not exist" containerID="6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281" Apr 17 17:36:48.825443 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:36:48.825411 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281"} err="failed to get container status \"6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281\": rpc error: code = NotFound desc = could not find container \"6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281\": container with ID starting with 6cccfcaee1e4516999ed7adb47a1319fd9eebfcd90c4f9185947286a99a2d281 not found: ID does not exist" Apr 17 17:36:48.835141 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.835105 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr" podStartSLOduration=1.835093197 podStartE2EDuration="1.835093197s" podCreationTimestamp="2026-04-17 17:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:36:48.833293777 +0000 UTC m=+625.507147109" watchObservedRunningTime="2026-04-17 17:36:48.835093197 +0000 UTC m=+625.508946535" Apr 17 17:36:48.864186 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:48.864152 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" podStartSLOduration=1.864138625 podStartE2EDuration="1.864138625s" podCreationTimestamp="2026-04-17 17:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:36:48.86260622 +0000 UTC m=+625.536459556" watchObservedRunningTime="2026-04-17 17:36:48.864138625 +0000 UTC m=+625.537991961" Apr 17 17:36:49.916782 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:49.916752 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6018dabf-5e25-458c-bebe-95770805b360" path="/var/lib/kubelet/pods/6018dabf-5e25-458c-bebe-95770805b360/volumes"
Apr 17 17:36:49.917123 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:49.917051 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8287280d-10d7-4e94-a450-2fda255ea9ee" path="/var/lib/kubelet/pods/8287280d-10d7-4e94-a450-2fda255ea9ee/volumes"
Apr 17 17:36:59.819115 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:59.819082 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"
Apr 17 17:36:59.819537 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:36:59.819136 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z2xtr"
Apr 17 17:37:04.961580 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:04.961541 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"]
Apr 17 17:37:04.964087 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:04.961848 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" podUID="875cebba-fbc2-4622-a3d9-91348867ab47" containerName="manager" containerID="cri-o://9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b" gracePeriod=10
Apr 17 17:37:05.720520 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.720490 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"
Apr 17 17:37:05.845570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.845485 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jnw8\" (UniqueName: \"kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8\") pod \"875cebba-fbc2-4622-a3d9-91348867ab47\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") "
Apr 17 17:37:05.845570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.845549 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume\") pod \"875cebba-fbc2-4622-a3d9-91348867ab47\" (UID: \"875cebba-fbc2-4622-a3d9-91348867ab47\") "
Apr 17 17:37:05.845955 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.845919 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "875cebba-fbc2-4622-a3d9-91348867ab47" (UID: "875cebba-fbc2-4622-a3d9-91348867ab47"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:05.847567 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.847543 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8" (OuterVolumeSpecName: "kube-api-access-9jnw8") pod "875cebba-fbc2-4622-a3d9-91348867ab47" (UID: "875cebba-fbc2-4622-a3d9-91348867ab47"). InnerVolumeSpecName "kube-api-access-9jnw8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:37:05.871989 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.871954 2571 generic.go:358] "Generic (PLEG): container finished" podID="875cebba-fbc2-4622-a3d9-91348867ab47" containerID="9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b" exitCode=0
Apr 17 17:37:05.872146 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.872022 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"
Apr 17 17:37:05.872146 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.872038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" event={"ID":"875cebba-fbc2-4622-a3d9-91348867ab47","Type":"ContainerDied","Data":"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"}
Apr 17 17:37:05.872146 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.872080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6" event={"ID":"875cebba-fbc2-4622-a3d9-91348867ab47","Type":"ContainerDied","Data":"2d4328ddc20fcebd693b6804b3be8b0024428cc1c3b9dd5e1902b2a4d196565c"}
Apr 17 17:37:05.872146 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.872101 2571 scope.go:117] "RemoveContainer" containerID="9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"
Apr 17 17:37:05.880603 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.880585 2571 scope.go:117] "RemoveContainer" containerID="9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"
Apr 17 17:37:05.880919 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:37:05.880900 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b\": container with ID starting with 9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b not found: ID does not exist" containerID="9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"
Apr 17 17:37:05.880965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.880929 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b"} err="failed to get container status \"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b\": rpc error: code = NotFound desc = could not find container \"9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b\": container with ID starting with 9a8f1c8daf81bc68d023ab172c151e608ab8be1e4eb3b877e331247c97fb1b8b not found: ID does not exist"
Apr 17 17:37:05.897139 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.897109 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"]
Apr 17 17:37:05.900585 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.900549 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d8vb6"]
Apr 17 17:37:05.916643 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.916614 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875cebba-fbc2-4622-a3d9-91348867ab47" path="/var/lib/kubelet/pods/875cebba-fbc2-4622-a3d9-91348867ab47/volumes"
Apr 17 17:37:05.946251 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.946215 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jnw8\" (UniqueName: \"kubernetes.io/projected/875cebba-fbc2-4622-a3d9-91348867ab47-kube-api-access-9jnw8\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\""
Apr 17 17:37:05.946251 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:05.946245 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/875cebba-fbc2-4622-a3d9-91348867ab47-extensions-socket-volume\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\""
Apr 17 17:37:21.204176 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.204134 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"]
Apr 17 17:37:21.204819 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.204500 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="875cebba-fbc2-4622-a3d9-91348867ab47" containerName="manager"
Apr 17 17:37:21.204819 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.204516 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="875cebba-fbc2-4622-a3d9-91348867ab47" containerName="manager"
Apr 17 17:37:21.204819 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.204589 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="875cebba-fbc2-4622-a3d9-91348867ab47" containerName="manager"
Apr 17 17:37:21.208091 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.208064 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.210899 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.210866 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-r2h8g\""
Apr 17 17:37:21.221554 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.221518 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"]
Apr 17 17:37:21.372645 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372645 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.372888 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.373071 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.372889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdj2\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-kube-api-access-qrdj2\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473463 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473427 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdj2\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-kube-api-access-qrdj2\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.473657 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474070 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474070 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.473911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474070 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.474057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474216 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.474167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474216 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.474199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.474359 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.474332 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.475865 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.475841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.476128 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.476110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.482053 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.482030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.482404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.482383 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdj2\" (UniqueName: \"kubernetes.io/projected/30a1fd64-c3a6-42fe-a9d7-f0869357a3d0-kube-api-access-qrdj2\") pod \"maas-default-gateway-openshift-default-58b6f876-p962x\" (UID: \"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.520291 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.520245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:21.654675 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.654650 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"]
Apr 17 17:37:21.657553 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:37:21.657526 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a1fd64_c3a6_42fe_a9d7_f0869357a3d0.slice/crio-8e6dd8719203a552a8c5459327259235c810a4a44bd683239bee32944711fefd WatchSource:0}: Error finding container 8e6dd8719203a552a8c5459327259235c810a4a44bd683239bee32944711fefd: Status 404 returned error can't find the container with id 8e6dd8719203a552a8c5459327259235c810a4a44bd683239bee32944711fefd
Apr 17 17:37:21.659971 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.659946 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:37:21.660456 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.660424 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:37:21.660519 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.660503 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:37:21.660560 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.660547 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 17:37:21.928770 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.928659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x" event={"ID":"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0","Type":"ContainerStarted","Data":"32f7f3420e6b6f13972d54ce44bd155c981edbf6724869402ca49efdf00b308d"}
Apr 17 17:37:21.928770 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.928719 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x" event={"ID":"30a1fd64-c3a6-42fe-a9d7-f0869357a3d0","Type":"ContainerStarted","Data":"8e6dd8719203a552a8c5459327259235c810a4a44bd683239bee32944711fefd"}
Apr 17 17:37:21.949969 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:21.949916 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x" podStartSLOduration=0.949901065 podStartE2EDuration="949.901065ms" podCreationTimestamp="2026-04-17 17:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:37:21.948057676 +0000 UTC m=+658.621911041" watchObservedRunningTime="2026-04-17 17:37:21.949901065 +0000 UTC m=+658.623754401"
Apr 17 17:37:22.520810 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:22.520771 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:22.525764 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:22.525736 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:22.932487 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:22.932397 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:22.933677 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:22.933652 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-p962x"
Apr 17 17:37:26.904581 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.904547 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"]
Apr 17 17:37:26.909280 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.909260 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:26.911813 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.911785 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-l59xh\""
Apr 17 17:37:26.911960 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.911785 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 17:37:26.917746 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.917723 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"]
Apr 17 17:37:26.999944 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:26.999905 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"]
Apr 17 17:37:27.026792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.026755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tsf\" (UniqueName: \"kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.026949 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.026912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.127646 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.127607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.127883 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.127672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tsf\" (UniqueName: \"kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.128395 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.128375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.136168 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.136148 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tsf\" (UniqueName: \"kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf\") pod \"limitador-limitador-7d549b5b-dxmdx\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.220057 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.220027 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:27.348987 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.348836 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"]
Apr 17 17:37:27.351829 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:37:27.351799 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203f95b5_fde6_4632_800d_d9e834d871b0.slice/crio-16dbc36dfd08b1ff799aa6e8349149e9f9af4cc1df79e59aecf235f393d36524 WatchSource:0}: Error finding container 16dbc36dfd08b1ff799aa6e8349149e9f9af4cc1df79e59aecf235f393d36524: Status 404 returned error can't find the container with id 16dbc36dfd08b1ff799aa6e8349149e9f9af4cc1df79e59aecf235f393d36524
Apr 17 17:37:27.700816 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.700730 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"]
Apr 17 17:37:27.706144 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.706117 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9r5th"
Apr 17 17:37:27.708998 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.708779 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6r7s8\""
Apr 17 17:37:27.711291 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.711215 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"]
Apr 17 17:37:27.836484 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.836442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm5j8\" (UniqueName: \"kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8\") pod \"authorino-f99f4b5cd-9r5th\" (UID: \"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3\") " pod="kuadrant-system/authorino-f99f4b5cd-9r5th"
Apr 17 17:37:27.856562 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.856534 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"]
Apr 17 17:37:27.859901 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.859885 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-bjg8q"
Apr 17 17:37:27.865503 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.865471 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"]
Apr 17 17:37:27.937106 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.937078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm5j8\" (UniqueName: \"kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8\") pod \"authorino-f99f4b5cd-9r5th\" (UID: \"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3\") " pod="kuadrant-system/authorino-f99f4b5cd-9r5th"
Apr 17 17:37:27.945643 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.945618 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm5j8\" (UniqueName: \"kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8\") pod \"authorino-f99f4b5cd-9r5th\" (UID: \"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3\") " pod="kuadrant-system/authorino-f99f4b5cd-9r5th"
Apr 17 17:37:27.950339 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:27.950311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" event={"ID":"203f95b5-fde6-4632-800d-d9e834d871b0","Type":"ContainerStarted","Data":"16dbc36dfd08b1ff799aa6e8349149e9f9af4cc1df79e59aecf235f393d36524"}
Apr 17 17:37:28.017578 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.017547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9r5th"
Apr 17 17:37:28.037648 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.037611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b968k\" (UniqueName: \"kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k\") pod \"authorino-7498df8756-bjg8q\" (UID: \"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202\") " pod="kuadrant-system/authorino-7498df8756-bjg8q"
Apr 17 17:37:28.142797 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.141024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b968k\" (UniqueName: \"kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k\") pod \"authorino-7498df8756-bjg8q\" (UID: \"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202\") " pod="kuadrant-system/authorino-7498df8756-bjg8q"
Apr 17 17:37:28.157042 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.157006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b968k\" (UniqueName: \"kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k\") pod \"authorino-7498df8756-bjg8q\" (UID: \"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202\") " pod="kuadrant-system/authorino-7498df8756-bjg8q"
Apr 17 17:37:28.169543 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.169494 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-bjg8q"
Apr 17 17:37:28.187459 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.187406 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"]
Apr 17 17:37:28.344503 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.344445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"]
Apr 17 17:37:28.350337 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:37:28.349170 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54f45f0_bcb8_41cd_8fc3_a14cc2c3f202.slice/crio-8b5aea8ae92a842bbb01eeaa7800d4c4e80ccb49e1fb151ab87289f651bfb10d WatchSource:0}: Error finding container 8b5aea8ae92a842bbb01eeaa7800d4c4e80ccb49e1fb151ab87289f651bfb10d: Status 404 returned error can't find the container with id 8b5aea8ae92a842bbb01eeaa7800d4c4e80ccb49e1fb151ab87289f651bfb10d
Apr 17 17:37:28.955921 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.955850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-bjg8q" event={"ID":"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202","Type":"ContainerStarted","Data":"8b5aea8ae92a842bbb01eeaa7800d4c4e80ccb49e1fb151ab87289f651bfb10d"}
Apr 17 17:37:28.958874 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:28.958794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" event={"ID":"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3","Type":"ContainerStarted","Data":"c51f2fc8e20bf9659e81807776c68271342a097cc42ae25ae9513f5b9ac790b1"}
Apr 17 17:37:31.971409 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:31.971369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" event={"ID":"203f95b5-fde6-4632-800d-d9e834d871b0","Type":"ContainerStarted","Data":"b47054027a3281cb82b9cecaf89df5af8083fef65d884c5ddcebf1b9b7331cd0"}
Apr 17 17:37:31.971863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:31.971446 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx"
Apr 17 17:37:31.972694 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:31.972663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-bjg8q" event={"ID":"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202","Type":"ContainerStarted","Data":"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8"}
Apr 17 17:37:31.973932 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:31.973911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" event={"ID":"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3","Type":"ContainerStarted","Data":"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a"}
Apr 17 17:37:32.005275 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:32.005220 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-bjg8q" podStartSLOduration=1.929833757 podStartE2EDuration="5.005204472s" podCreationTimestamp="2026-04-17 17:37:27 +0000 UTC" firstStartedPulling="2026-04-17 17:37:28.35114402 +0000 UTC m=+665.024997340" lastFinishedPulling="2026-04-17 17:37:31.426514738 +0000 UTC m=+668.100368055" observedRunningTime="2026-04-17 17:37:32.004678725 +0000 UTC m=+668.678532061" watchObservedRunningTime="2026-04-17 17:37:32.005204472 +0000 UTC m=+668.679057832"
Apr 17 17:37:32.005730 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:32.005687 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" podStartSLOduration=1.8790712790000001 podStartE2EDuration="6.005678432s"
podCreationTimestamp="2026-04-17 17:37:26 +0000 UTC" firstStartedPulling="2026-04-17 17:37:27.353687391 +0000 UTC m=+664.027540705" lastFinishedPulling="2026-04-17 17:37:31.480294527 +0000 UTC m=+668.154147858" observedRunningTime="2026-04-17 17:37:31.989555991 +0000 UTC m=+668.663409329" watchObservedRunningTime="2026-04-17 17:37:32.005678432 +0000 UTC m=+668.679531823" Apr 17 17:37:32.020182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:32.020137 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" podStartSLOduration=1.776935086 podStartE2EDuration="5.020120848s" podCreationTimestamp="2026-04-17 17:37:27 +0000 UTC" firstStartedPulling="2026-04-17 17:37:28.194235787 +0000 UTC m=+664.868089118" lastFinishedPulling="2026-04-17 17:37:31.437421567 +0000 UTC m=+668.111274880" observedRunningTime="2026-04-17 17:37:32.01864325 +0000 UTC m=+668.692496587" watchObservedRunningTime="2026-04-17 17:37:32.020120848 +0000 UTC m=+668.693974184" Apr 17 17:37:32.055923 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:32.055894 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"] Apr 17 17:37:33.980859 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:33.980819 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" podUID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" containerName="authorino" containerID="cri-o://1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a" gracePeriod=30 Apr 17 17:37:34.239476 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.239407 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" Apr 17 17:37:34.301082 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.301038 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm5j8\" (UniqueName: \"kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8\") pod \"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3\" (UID: \"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3\") " Apr 17 17:37:34.303150 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.303117 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8" (OuterVolumeSpecName: "kube-api-access-tm5j8") pod "ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" (UID: "ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3"). InnerVolumeSpecName "kube-api-access-tm5j8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:37:34.401903 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.401872 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tm5j8\" (UniqueName: \"kubernetes.io/projected/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3-kube-api-access-tm5j8\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:37:34.986214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.986177 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" containerID="1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a" exitCode=0 Apr 17 17:37:34.986607 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.986228 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" Apr 17 17:37:34.986607 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.986274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" event={"ID":"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3","Type":"ContainerDied","Data":"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a"} Apr 17 17:37:34.986607 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.986312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9r5th" event={"ID":"ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3","Type":"ContainerDied","Data":"c51f2fc8e20bf9659e81807776c68271342a097cc42ae25ae9513f5b9ac790b1"} Apr 17 17:37:34.986607 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.986329 2571 scope.go:117] "RemoveContainer" containerID="1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a" Apr 17 17:37:34.995230 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.995212 2571 scope.go:117] "RemoveContainer" containerID="1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a" Apr 17 17:37:34.995527 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:37:34.995507 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a\": container with ID starting with 1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a not found: ID does not exist" containerID="1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a" Apr 17 17:37:34.995571 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:34.995538 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a"} err="failed to get container status \"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a\": rpc error: code = NotFound 
desc = could not find container \"1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a\": container with ID starting with 1f1979985edd90cd2f3efc3d79b54f79d47e349db1342dc289a94bbd3257326a not found: ID does not exist" Apr 17 17:37:35.007649 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:35.007620 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"] Apr 17 17:37:35.011092 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:35.011060 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9r5th"] Apr 17 17:37:35.917185 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:35.917152 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" path="/var/lib/kubelet/pods/ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3/volumes" Apr 17 17:37:41.576290 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:41.576254 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"] Apr 17 17:37:41.576727 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:41.576517 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" podUID="203f95b5-fde6-4632-800d-d9e834d871b0" containerName="limitador" containerID="cri-o://b47054027a3281cb82b9cecaf89df5af8083fef65d884c5ddcebf1b9b7331cd0" gracePeriod=30 Apr 17 17:37:41.577319 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:41.577300 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" Apr 17 17:37:42.014782 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.014748 2571 generic.go:358] "Generic (PLEG): container finished" podID="203f95b5-fde6-4632-800d-d9e834d871b0" containerID="b47054027a3281cb82b9cecaf89df5af8083fef65d884c5ddcebf1b9b7331cd0" exitCode=0 Apr 17 17:37:42.014935 ip-10-0-138-42 kubenswrapper[2571]: I0417 
17:37:42.014817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" event={"ID":"203f95b5-fde6-4632-800d-d9e834d871b0","Type":"ContainerDied","Data":"b47054027a3281cb82b9cecaf89df5af8083fef65d884c5ddcebf1b9b7331cd0"} Apr 17 17:37:42.123336 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.123315 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" Apr 17 17:37:42.169093 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.169013 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file\") pod \"203f95b5-fde6-4632-800d-d9e834d871b0\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " Apr 17 17:37:42.169093 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.169078 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tsf\" (UniqueName: \"kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf\") pod \"203f95b5-fde6-4632-800d-d9e834d871b0\" (UID: \"203f95b5-fde6-4632-800d-d9e834d871b0\") " Apr 17 17:37:42.169390 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.169367 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file" (OuterVolumeSpecName: "config-file") pod "203f95b5-fde6-4632-800d-d9e834d871b0" (UID: "203f95b5-fde6-4632-800d-d9e834d871b0"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:37:42.171134 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.171100 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf" (OuterVolumeSpecName: "kube-api-access-h2tsf") pod "203f95b5-fde6-4632-800d-d9e834d871b0" (UID: "203f95b5-fde6-4632-800d-d9e834d871b0"). InnerVolumeSpecName "kube-api-access-h2tsf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:37:42.270324 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.270288 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2tsf\" (UniqueName: \"kubernetes.io/projected/203f95b5-fde6-4632-800d-d9e834d871b0-kube-api-access-h2tsf\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:37:42.270324 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:42.270318 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/203f95b5-fde6-4632-800d-d9e834d871b0-config-file\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:37:43.019712 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.019665 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" Apr 17 17:37:43.020182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.019692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-dxmdx" event={"ID":"203f95b5-fde6-4632-800d-d9e834d871b0","Type":"ContainerDied","Data":"16dbc36dfd08b1ff799aa6e8349149e9f9af4cc1df79e59aecf235f393d36524"} Apr 17 17:37:43.020182 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.019757 2571 scope.go:117] "RemoveContainer" containerID="b47054027a3281cb82b9cecaf89df5af8083fef65d884c5ddcebf1b9b7331cd0" Apr 17 17:37:43.042054 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.042020 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"] Apr 17 17:37:43.050608 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.050577 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-dxmdx"] Apr 17 17:37:43.917123 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:43.917089 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203f95b5-fde6-4632-800d-d9e834d871b0" path="/var/lib/kubelet/pods/203f95b5-fde6-4632-800d-d9e834d871b0/volumes" Apr 17 17:37:46.944325 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.944293 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-j4tff"] Apr 17 17:37:46.944884 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.944867 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" containerName="authorino" Apr 17 17:37:46.944930 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.944889 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" containerName="authorino" Apr 17 17:37:46.944968 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.944929 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="203f95b5-fde6-4632-800d-d9e834d871b0" containerName="limitador" Apr 17 17:37:46.944968 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.944938 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f95b5-fde6-4632-800d-d9e834d871b0" containerName="limitador" Apr 17 17:37:46.945028 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.945017 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="203f95b5-fde6-4632-800d-d9e834d871b0" containerName="limitador" Apr 17 17:37:46.945058 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.945031 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce11d214-3d34-4a5b-9e4a-c1d79d67d3e3" containerName="authorino" Apr 17 17:37:46.949626 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.949600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:46.952827 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.952804 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 17:37:46.953039 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.953025 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-9n88n\"" Apr 17 17:37:46.966505 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:46.966475 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-j4tff"] Apr 17 17:37:47.011498 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.011458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f9e55fa-b286-428a-a040-5ea7dd918f33-data\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.011686 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:37:47.011526 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwbl\" (UniqueName: \"kubernetes.io/projected/9f9e55fa-b286-428a-a040-5ea7dd918f33-kube-api-access-qhwbl\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.112672 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.112629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f9e55fa-b286-428a-a040-5ea7dd918f33-data\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.112853 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.112689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwbl\" (UniqueName: \"kubernetes.io/projected/9f9e55fa-b286-428a-a040-5ea7dd918f33-kube-api-access-qhwbl\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.112995 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.112975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f9e55fa-b286-428a-a040-5ea7dd918f33-data\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.123261 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.123224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwbl\" (UniqueName: \"kubernetes.io/projected/9f9e55fa-b286-428a-a040-5ea7dd918f33-kube-api-access-qhwbl\") pod \"postgres-868db5846d-j4tff\" (UID: \"9f9e55fa-b286-428a-a040-5ea7dd918f33\") " pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 
17:37:47.259892 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.259852 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:47.386629 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:47.386606 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-j4tff"] Apr 17 17:37:47.388826 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:37:47.388791 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f9e55fa_b286_428a_a040_5ea7dd918f33.slice/crio-b2f9254828a8035071ab283b33afaa0e9ce1c95d404475de3707176a778c80c5 WatchSource:0}: Error finding container b2f9254828a8035071ab283b33afaa0e9ce1c95d404475de3707176a778c80c5: Status 404 returned error can't find the container with id b2f9254828a8035071ab283b33afaa0e9ce1c95d404475de3707176a778c80c5 Apr 17 17:37:48.038318 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:48.038281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-j4tff" event={"ID":"9f9e55fa-b286-428a-a040-5ea7dd918f33","Type":"ContainerStarted","Data":"b2f9254828a8035071ab283b33afaa0e9ce1c95d404475de3707176a778c80c5"} Apr 17 17:37:54.063215 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:54.063178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-j4tff" event={"ID":"9f9e55fa-b286-428a-a040-5ea7dd918f33","Type":"ContainerStarted","Data":"9b210371e04c029515e0fce605006651953de3283088bccd3cf03e8f9434b87d"} Apr 17 17:37:54.063742 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:54.063289 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:37:54.080522 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:37:54.080470 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-j4tff" 
podStartSLOduration=2.468316451 podStartE2EDuration="8.080454787s" podCreationTimestamp="2026-04-17 17:37:46 +0000 UTC" firstStartedPulling="2026-04-17 17:37:47.390091682 +0000 UTC m=+684.063944996" lastFinishedPulling="2026-04-17 17:37:53.002230015 +0000 UTC m=+689.676083332" observedRunningTime="2026-04-17 17:37:54.079281715 +0000 UTC m=+690.753135052" watchObservedRunningTime="2026-04-17 17:37:54.080454787 +0000 UTC m=+690.754308123" Apr 17 17:38:00.095149 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:00.095122 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-j4tff" Apr 17 17:38:09.278849 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.278765 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq2ns"] Apr 17 17:38:09.282318 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.282296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" Apr 17 17:38:09.285403 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.285385 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 17 17:38:09.285567 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.285547 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 17 17:38:09.286533 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.286518 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-fthdr\"" Apr 17 17:38:09.291279 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.291259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq2ns"] Apr 17 17:38:09.412953 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.412916 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvmb\" (UniqueName: \"kubernetes.io/projected/07e69a5a-c882-49c1-b0c6-0ce1683d55a8-kube-api-access-lnvmb\") pod \"keycloak-operator-5c4df598dd-lq2ns\" (UID: \"07e69a5a-c882-49c1-b0c6-0ce1683d55a8\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" Apr 17 17:38:09.513404 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.513371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvmb\" (UniqueName: \"kubernetes.io/projected/07e69a5a-c882-49c1-b0c6-0ce1683d55a8-kube-api-access-lnvmb\") pod \"keycloak-operator-5c4df598dd-lq2ns\" (UID: \"07e69a5a-c882-49c1-b0c6-0ce1683d55a8\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" Apr 17 17:38:09.523483 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.523447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvmb\" (UniqueName: \"kubernetes.io/projected/07e69a5a-c882-49c1-b0c6-0ce1683d55a8-kube-api-access-lnvmb\") pod \"keycloak-operator-5c4df598dd-lq2ns\" (UID: \"07e69a5a-c882-49c1-b0c6-0ce1683d55a8\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" Apr 17 17:38:09.594070 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.593983 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" Apr 17 17:38:09.719807 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:09.717311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq2ns"] Apr 17 17:38:10.117556 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:10.117518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" event={"ID":"07e69a5a-c882-49c1-b0c6-0ce1683d55a8","Type":"ContainerStarted","Data":"5cda368ba5532d5cc9c20279ffd3b0255e6a5aca3b47fe57b575dec9030b6f82"} Apr 17 17:38:16.139975 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:16.139934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" event={"ID":"07e69a5a-c882-49c1-b0c6-0ce1683d55a8","Type":"ContainerStarted","Data":"df98917797f1cc67cabf7d67ae6f9068c8d71eda2b747f170f3cc643e673b0e9"} Apr 17 17:38:16.167112 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:16.167063 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-lq2ns" podStartSLOduration=1.538274361 podStartE2EDuration="7.167049255s" podCreationTimestamp="2026-04-17 17:38:09 +0000 UTC" firstStartedPulling="2026-04-17 17:38:09.722532424 +0000 UTC m=+706.396385738" lastFinishedPulling="2026-04-17 17:38:15.351307316 +0000 UTC m=+712.025160632" observedRunningTime="2026-04-17 17:38:16.165836209 +0000 UTC m=+712.839689544" watchObservedRunningTime="2026-04-17 17:38:16.167049255 +0000 UTC m=+712.840902668" Apr 17 17:38:55.858265 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:55.858227 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 17:38:55.869263 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:55.869237 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 
17:38:55.869384 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:55.869327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:55.979879 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:55.979851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjvs\" (UniqueName: \"kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs\") pod \"authorino-8b475cf9f-ks2c9\" (UID: \"f7dbf07b-d429-4de4-8339-5f95063c1fb0\") " pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:56.080613 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.080584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjvs\" (UniqueName: \"kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs\") pod \"authorino-8b475cf9f-ks2c9\" (UID: \"f7dbf07b-d429-4de4-8339-5f95063c1fb0\") " pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:56.089879 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.089859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjvs\" (UniqueName: \"kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs\") pod \"authorino-8b475cf9f-ks2c9\" (UID: \"f7dbf07b-d429-4de4-8339-5f95063c1fb0\") " pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:56.101662 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.101638 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 17:38:56.101837 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.101824 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:56.132962 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.132935 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7f4ccd95db-xr65j"] Apr 17 17:38:56.138345 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.138322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:56.142742 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.142694 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7f4ccd95db-xr65j"] Apr 17 17:38:56.240738 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.239724 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 17:38:56.243836 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:38:56.243809 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7dbf07b_d429_4de4_8339_5f95063c1fb0.slice/crio-1a298273697667650d531aaef9d9a2ea30f32c5fd84653573fe3ae8769a94a16 WatchSource:0}: Error finding container 1a298273697667650d531aaef9d9a2ea30f32c5fd84653573fe3ae8769a94a16: Status 404 returned error can't find the container with id 1a298273697667650d531aaef9d9a2ea30f32c5fd84653573fe3ae8769a94a16 Apr 17 17:38:56.271685 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.271655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" event={"ID":"f7dbf07b-d429-4de4-8339-5f95063c1fb0","Type":"ContainerStarted","Data":"1a298273697667650d531aaef9d9a2ea30f32c5fd84653573fe3ae8769a94a16"} Apr 17 17:38:56.291741 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.291198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87k45\" (UniqueName: 
\"kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45\") pod \"authorino-7f4ccd95db-xr65j\" (UID: \"b1de2830-7657-49ab-acf9-9b6257eacd7f\") " pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:56.369457 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.369376 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7f4ccd95db-xr65j"] Apr 17 17:38:56.369647 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:38:56.369629 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-87k45], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7f4ccd95db-xr65j" podUID="b1de2830-7657-49ab-acf9-9b6257eacd7f" Apr 17 17:38:56.392622 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.392579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87k45\" (UniqueName: \"kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45\") pod \"authorino-7f4ccd95db-xr65j\" (UID: \"b1de2830-7657-49ab-acf9-9b6257eacd7f\") " pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:56.399813 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.399776 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:38:56.403514 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.403478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87k45\" (UniqueName: \"kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45\") pod \"authorino-7f4ccd95db-xr65j\" (UID: \"b1de2830-7657-49ab-acf9-9b6257eacd7f\") " pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:56.405152 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.405130 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.408628 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.407971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 17:38:56.409566 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.409548 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:38:56.493965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.493774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert\") pod \"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.493965 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.493888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8f7\" (UniqueName: \"kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7\") pod \"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.594887 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.594852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8f7\" (UniqueName: \"kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7\") pod \"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.595491 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.595469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert\") pod 
\"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.598824 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.598782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert\") pod \"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.604420 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.604397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8f7\" (UniqueName: \"kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7\") pod \"authorino-b8c556d58-8nt7f\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.719881 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.719858 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:38:56.874218 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:56.874193 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:38:56.875945 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:38:56.875920 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5276b25c_199a_4e78_b79d_b5991cc0a565.slice/crio-f5488c197cce253691fa66a09d18d92b8fb57e3b0481afe6d7cb0901f299ed12 WatchSource:0}: Error finding container f5488c197cce253691fa66a09d18d92b8fb57e3b0481afe6d7cb0901f299ed12: Status 404 returned error can't find the container with id f5488c197cce253691fa66a09d18d92b8fb57e3b0481afe6d7cb0901f299ed12 Apr 17 17:38:57.276640 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.276607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-b8c556d58-8nt7f" event={"ID":"5276b25c-199a-4e78-b79d-b5991cc0a565","Type":"ContainerStarted","Data":"f5488c197cce253691fa66a09d18d92b8fb57e3b0481afe6d7cb0901f299ed12"} Apr 17 17:38:57.277922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.277894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" event={"ID":"f7dbf07b-d429-4de4-8339-5f95063c1fb0","Type":"ContainerStarted","Data":"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c"} Apr 17 17:38:57.277922 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.277916 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:57.278083 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.277986 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" podUID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" containerName="authorino" containerID="cri-o://800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c" gracePeriod=30 Apr 17 17:38:57.398229 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.398194 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:57.509863 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.509780 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87k45\" (UniqueName: \"kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45\") pod \"b1de2830-7657-49ab-acf9-9b6257eacd7f\" (UID: \"b1de2830-7657-49ab-acf9-9b6257eacd7f\") " Apr 17 17:38:57.511868 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.511841 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45" (OuterVolumeSpecName: "kube-api-access-87k45") pod "b1de2830-7657-49ab-acf9-9b6257eacd7f" (UID: "b1de2830-7657-49ab-acf9-9b6257eacd7f"). InnerVolumeSpecName "kube-api-access-87k45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:57.545623 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.545602 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:57.610996 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.610963 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87k45\" (UniqueName: \"kubernetes.io/projected/b1de2830-7657-49ab-acf9-9b6257eacd7f-kube-api-access-87k45\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:38:57.711310 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.711284 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjvs\" (UniqueName: \"kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs\") pod \"f7dbf07b-d429-4de4-8339-5f95063c1fb0\" (UID: \"f7dbf07b-d429-4de4-8339-5f95063c1fb0\") " Apr 17 17:38:57.713214 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.713186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs" (OuterVolumeSpecName: "kube-api-access-8rjvs") pod "f7dbf07b-d429-4de4-8339-5f95063c1fb0" (UID: "f7dbf07b-d429-4de4-8339-5f95063c1fb0"). InnerVolumeSpecName "kube-api-access-8rjvs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:57.813004 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:57.812914 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rjvs\" (UniqueName: \"kubernetes.io/projected/f7dbf07b-d429-4de4-8339-5f95063c1fb0-kube-api-access-8rjvs\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:38:58.283422 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.283384 2571 generic.go:358] "Generic (PLEG): container finished" podID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" containerID="800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c" exitCode=0 Apr 17 17:38:58.283873 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.283536 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" event={"ID":"f7dbf07b-d429-4de4-8339-5f95063c1fb0","Type":"ContainerDied","Data":"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c"} Apr 17 17:38:58.283873 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.283566 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" event={"ID":"f7dbf07b-d429-4de4-8339-5f95063c1fb0","Type":"ContainerDied","Data":"1a298273697667650d531aaef9d9a2ea30f32c5fd84653573fe3ae8769a94a16"} Apr 17 17:38:58.283873 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.283589 2571 scope.go:117] "RemoveContainer" containerID="800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c" Apr 17 17:38:58.283873 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.283734 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ks2c9" Apr 17 17:38:58.287002 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.286976 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7f4ccd95db-xr65j" Apr 17 17:38:58.287754 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.287730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-b8c556d58-8nt7f" event={"ID":"5276b25c-199a-4e78-b79d-b5991cc0a565","Type":"ContainerStarted","Data":"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b"} Apr 17 17:38:58.297187 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.297168 2571 scope.go:117] "RemoveContainer" containerID="800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c" Apr 17 17:38:58.297424 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:38:58.297406 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c\": container with ID starting with 800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c not found: ID does not exist" containerID="800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c" Apr 17 17:38:58.297476 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.297435 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c"} err="failed to get container status \"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c\": rpc error: code = NotFound desc = could not find container \"800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c\": container with ID starting with 800b1c51fa148844a257d0b478fa0005f2de9655d5d866b2c34ecafb87ad324c not found: ID does not exist" Apr 17 17:38:58.305910 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.305873 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-b8c556d58-8nt7f" podStartSLOduration=1.9131392680000001 podStartE2EDuration="2.305861722s" 
podCreationTimestamp="2026-04-17 17:38:56 +0000 UTC" firstStartedPulling="2026-04-17 17:38:56.877201552 +0000 UTC m=+753.551054866" lastFinishedPulling="2026-04-17 17:38:57.269924004 +0000 UTC m=+753.943777320" observedRunningTime="2026-04-17 17:38:58.305607853 +0000 UTC m=+754.979461190" watchObservedRunningTime="2026-04-17 17:38:58.305861722 +0000 UTC m=+754.979715056" Apr 17 17:38:58.319550 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.319529 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 17:38:58.324622 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.324594 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ks2c9"] Apr 17 17:38:58.330207 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.330183 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"] Apr 17 17:38:58.330395 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.330375 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-bjg8q" podUID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" containerName="authorino" containerID="cri-o://1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8" gracePeriod=30 Apr 17 17:38:58.355513 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.355438 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7f4ccd95db-xr65j"] Apr 17 17:38:58.358291 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.358264 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7f4ccd95db-xr65j"] Apr 17 17:38:58.570343 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.570324 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-bjg8q" Apr 17 17:38:58.720902 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.720877 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b968k\" (UniqueName: \"kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k\") pod \"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202\" (UID: \"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202\") " Apr 17 17:38:58.722792 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.722772 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k" (OuterVolumeSpecName: "kube-api-access-b968k") pod "c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" (UID: "c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202"). InnerVolumeSpecName "kube-api-access-b968k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:58.821388 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:58.821362 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b968k\" (UniqueName: \"kubernetes.io/projected/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202-kube-api-access-b968k\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:38:59.292609 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.292577 2571 generic.go:358] "Generic (PLEG): container finished" podID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" containerID="1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8" exitCode=0 Apr 17 17:38:59.293167 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.292649 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-bjg8q" Apr 17 17:38:59.293167 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.292653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-bjg8q" event={"ID":"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202","Type":"ContainerDied","Data":"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8"} Apr 17 17:38:59.293167 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.292692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-bjg8q" event={"ID":"c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202","Type":"ContainerDied","Data":"8b5aea8ae92a842bbb01eeaa7800d4c4e80ccb49e1fb151ab87289f651bfb10d"} Apr 17 17:38:59.293167 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.292726 2571 scope.go:117] "RemoveContainer" containerID="1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8" Apr 17 17:38:59.301545 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.301529 2571 scope.go:117] "RemoveContainer" containerID="1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8" Apr 17 17:38:59.301848 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:38:59.301823 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8\": container with ID starting with 1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8 not found: ID does not exist" containerID="1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8" Apr 17 17:38:59.301938 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.301858 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8"} err="failed to get container status \"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8\": rpc error: code = 
NotFound desc = could not find container \"1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8\": container with ID starting with 1f0af88dad266e5314f088b520e38440711a361fd78bd2d32d7810a52743c6e8 not found: ID does not exist" Apr 17 17:38:59.320773 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.320742 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"] Apr 17 17:38:59.325377 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.325356 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-bjg8q"] Apr 17 17:38:59.917288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.917254 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1de2830-7657-49ab-acf9-9b6257eacd7f" path="/var/lib/kubelet/pods/b1de2830-7657-49ab-acf9-9b6257eacd7f/volumes" Apr 17 17:38:59.917479 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.917468 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" path="/var/lib/kubelet/pods/c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202/volumes" Apr 17 17:38:59.917760 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:38:59.917749 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" path="/var/lib/kubelet/pods/f7dbf07b-d429-4de4-8339-5f95063c1fb0/volumes" Apr 17 17:40:12.180606 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.180566 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:40:12.181183 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181110 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" containerName="authorino" Apr 17 17:40:12.181183 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181132 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" containerName="authorino" Apr 17 17:40:12.181183 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181144 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" containerName="authorino" Apr 17 17:40:12.181183 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181153 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" containerName="authorino" Apr 17 17:40:12.181389 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181285 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c54f45f0-bcb8-41cd-8fc3-a14cc2c3f202" containerName="authorino" Apr 17 17:40:12.181389 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.181304 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7dbf07b-d429-4de4-8339-5f95063c1fb0" containerName="authorino" Apr 17 17:40:12.184394 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.184373 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.187167 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.187144 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 17 17:40:12.191762 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.191741 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:40:12.208649 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.208627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.208790 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.208666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.208839 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.208826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjb6s\" (UniqueName: \"kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.309402 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.309365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjb6s\" (UniqueName: 
\"kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.309583 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.309417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.309583 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.309447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.310179 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.310143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.311890 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.311864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.317277 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.317253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjb6s\" (UniqueName: 
\"kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s\") pod \"authorino-6645455594-7b7hf\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.494867 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.494837 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:40:12.613569 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:12.613549 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:40:12.615414 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:40:12.615389 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d777ee4_ca0b_47d8_bab8_230dc27b628b.slice/crio-45faf68a69dd16eea8d22464164af03a69011e5d1cd103edbea5e07abe5ec147 WatchSource:0}: Error finding container 45faf68a69dd16eea8d22464164af03a69011e5d1cd103edbea5e07abe5ec147: Status 404 returned error can't find the container with id 45faf68a69dd16eea8d22464164af03a69011e5d1cd103edbea5e07abe5ec147 Apr 17 17:40:13.547632 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.547591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6645455594-7b7hf" event={"ID":"1d777ee4-ca0b-47d8-bab8-230dc27b628b","Type":"ContainerStarted","Data":"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21"} Apr 17 17:40:13.547632 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.547638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6645455594-7b7hf" event={"ID":"1d777ee4-ca0b-47d8-bab8-230dc27b628b","Type":"ContainerStarted","Data":"45faf68a69dd16eea8d22464164af03a69011e5d1cd103edbea5e07abe5ec147"} Apr 17 17:40:13.563954 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.563901 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/authorino-6645455594-7b7hf" podStartSLOduration=0.97920521 podStartE2EDuration="1.563882481s" podCreationTimestamp="2026-04-17 17:40:12 +0000 UTC" firstStartedPulling="2026-04-17 17:40:12.616769347 +0000 UTC m=+829.290622664" lastFinishedPulling="2026-04-17 17:40:13.201446621 +0000 UTC m=+829.875299935" observedRunningTime="2026-04-17 17:40:13.562450572 +0000 UTC m=+830.236303908" watchObservedRunningTime="2026-04-17 17:40:13.563882481 +0000 UTC m=+830.237735818" Apr 17 17:40:13.594925 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.594886 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:40:13.595433 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.595405 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-b8c556d58-8nt7f" podUID="5276b25c-199a-4e78-b79d-b5991cc0a565" containerName="authorino" containerID="cri-o://40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b" gracePeriod=30 Apr 17 17:40:13.856610 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.856590 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:40:13.923811 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.923770 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf8f7\" (UniqueName: \"kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7\") pod \"5276b25c-199a-4e78-b79d-b5991cc0a565\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " Apr 17 17:40:13.923993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.923896 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert\") pod \"5276b25c-199a-4e78-b79d-b5991cc0a565\" (UID: \"5276b25c-199a-4e78-b79d-b5991cc0a565\") " Apr 17 17:40:13.925993 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.925963 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7" (OuterVolumeSpecName: "kube-api-access-jf8f7") pod "5276b25c-199a-4e78-b79d-b5991cc0a565" (UID: "5276b25c-199a-4e78-b79d-b5991cc0a565"). InnerVolumeSpecName "kube-api-access-jf8f7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:40:13.933746 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:13.933725 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "5276b25c-199a-4e78-b79d-b5991cc0a565" (UID: "5276b25c-199a-4e78-b79d-b5991cc0a565"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:40:14.024804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.024771 2571 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5276b25c-199a-4e78-b79d-b5991cc0a565-tls-cert\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:40:14.024804 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.024800 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jf8f7\" (UniqueName: \"kubernetes.io/projected/5276b25c-199a-4e78-b79d-b5991cc0a565-kube-api-access-jf8f7\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:40:14.552005 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.551967 2571 generic.go:358] "Generic (PLEG): container finished" podID="5276b25c-199a-4e78-b79d-b5991cc0a565" containerID="40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b" exitCode=0 Apr 17 17:40:14.552399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.552017 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-b8c556d58-8nt7f" Apr 17 17:40:14.552399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.552035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-b8c556d58-8nt7f" event={"ID":"5276b25c-199a-4e78-b79d-b5991cc0a565","Type":"ContainerDied","Data":"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b"} Apr 17 17:40:14.552399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.552073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-b8c556d58-8nt7f" event={"ID":"5276b25c-199a-4e78-b79d-b5991cc0a565","Type":"ContainerDied","Data":"f5488c197cce253691fa66a09d18d92b8fb57e3b0481afe6d7cb0901f299ed12"} Apr 17 17:40:14.552399 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.552094 2571 scope.go:117] "RemoveContainer" containerID="40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b" Apr 17 17:40:14.560304 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.560285 2571 scope.go:117] "RemoveContainer" containerID="40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b" Apr 17 17:40:14.560556 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:40:14.560539 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b\": container with ID starting with 40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b not found: ID does not exist" containerID="40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b" Apr 17 17:40:14.560609 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.560566 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b"} err="failed to get container status \"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b\": rpc error: code = NotFound 
desc = could not find container \"40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b\": container with ID starting with 40c35c2c8f2a2c0963f1f844edf97ed1f08c085e811a88a83ffc10b672256b6b not found: ID does not exist" Apr 17 17:40:14.575816 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.575769 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:40:14.583024 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:14.583005 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-b8c556d58-8nt7f"] Apr 17 17:40:15.917084 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:40:15.917051 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5276b25c-199a-4e78-b79d-b5991cc0a565" path="/var/lib/kubelet/pods/5276b25c-199a-4e78-b79d-b5991cc0a565/volumes" Apr 17 17:42:07.903861 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.903830 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-86f479759f-gmmvw"] Apr 17 17:42:07.906234 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.904190 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5276b25c-199a-4e78-b79d-b5991cc0a565" containerName="authorino" Apr 17 17:42:07.906234 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.904201 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5276b25c-199a-4e78-b79d-b5991cc0a565" containerName="authorino" Apr 17 17:42:07.906234 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.904269 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5276b25c-199a-4e78-b79d-b5991cc0a565" containerName="authorino" Apr 17 17:42:07.907123 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.907108 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:07.924320 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:07.924293 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86f479759f-gmmvw"] Apr 17 17:42:08.038733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.038689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgbj\" (UniqueName: \"kubernetes.io/projected/3b88b667-2264-4548-a4ff-8129b04fc720-kube-api-access-5rgbj\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.038733 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.038742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/3b88b667-2264-4548-a4ff-8129b04fc720-oidc-ca\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.038933 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.038834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b88b667-2264-4548-a4ff-8129b04fc720-tls-cert\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.139414 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.139385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgbj\" (UniqueName: \"kubernetes.io/projected/3b88b667-2264-4548-a4ff-8129b04fc720-kube-api-access-5rgbj\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.139414 ip-10-0-138-42 
kubenswrapper[2571]: I0417 17:42:08.139419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/3b88b667-2264-4548-a4ff-8129b04fc720-oidc-ca\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.139546 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.139456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b88b667-2264-4548-a4ff-8129b04fc720-tls-cert\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.140086 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.140069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/3b88b667-2264-4548-a4ff-8129b04fc720-oidc-ca\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.141743 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.141728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b88b667-2264-4548-a4ff-8129b04fc720-tls-cert\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.147528 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.147508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgbj\" (UniqueName: \"kubernetes.io/projected/3b88b667-2264-4548-a4ff-8129b04fc720-kube-api-access-5rgbj\") pod \"authorino-86f479759f-gmmvw\" (UID: \"3b88b667-2264-4548-a4ff-8129b04fc720\") " pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.217343 ip-10-0-138-42 kubenswrapper[2571]: 
I0417 17:42:08.217325 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-86f479759f-gmmvw" Apr 17 17:42:08.344654 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.344632 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86f479759f-gmmvw"] Apr 17 17:42:08.346472 ip-10-0-138-42 kubenswrapper[2571]: W0417 17:42:08.346444 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b88b667_2264_4548_a4ff_8129b04fc720.slice/crio-bb14cdcec5b41caee48c222e17f92124791a98295aa59ae93e287bbf9b2d525d WatchSource:0}: Error finding container bb14cdcec5b41caee48c222e17f92124791a98295aa59ae93e287bbf9b2d525d: Status 404 returned error can't find the container with id bb14cdcec5b41caee48c222e17f92124791a98295aa59ae93e287bbf9b2d525d Apr 17 17:42:08.933485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.933450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86f479759f-gmmvw" event={"ID":"3b88b667-2264-4548-a4ff-8129b04fc720","Type":"ContainerStarted","Data":"7939871e9322e68cdb1772afb3f8e54ab0fa9e30d089018fb2f772af35697a7c"} Apr 17 17:42:08.933485 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.933485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86f479759f-gmmvw" event={"ID":"3b88b667-2264-4548-a4ff-8129b04fc720","Type":"ContainerStarted","Data":"bb14cdcec5b41caee48c222e17f92124791a98295aa59ae93e287bbf9b2d525d"} Apr 17 17:42:08.951129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.951078 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-86f479759f-gmmvw" podStartSLOduration=1.56076345 podStartE2EDuration="1.951061808s" podCreationTimestamp="2026-04-17 17:42:07 +0000 UTC" firstStartedPulling="2026-04-17 17:42:08.347731752 +0000 UTC m=+945.021585066" lastFinishedPulling="2026-04-17 
17:42:08.73803011 +0000 UTC m=+945.411883424" observedRunningTime="2026-04-17 17:42:08.94972037 +0000 UTC m=+945.623573704" watchObservedRunningTime="2026-04-17 17:42:08.951061808 +0000 UTC m=+945.624915145" Apr 17 17:42:08.981129 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.981093 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:42:08.981417 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:08.981366 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6645455594-7b7hf" podUID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" containerName="authorino" containerID="cri-o://1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21" gracePeriod=30 Apr 17 17:42:09.237419 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.237391 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:42:09.349478 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.349383 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjb6s\" (UniqueName: \"kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s\") pod \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " Apr 17 17:42:09.349478 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.349438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert\") pod \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " Apr 17 17:42:09.349691 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.349567 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca\") pod 
\"1d777ee4-ca0b-47d8-bab8-230dc27b628b\" (UID: \"1d777ee4-ca0b-47d8-bab8-230dc27b628b\") " Apr 17 17:42:09.352080 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.352049 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s" (OuterVolumeSpecName: "kube-api-access-gjb6s") pod "1d777ee4-ca0b-47d8-bab8-230dc27b628b" (UID: "1d777ee4-ca0b-47d8-bab8-230dc27b628b"). InnerVolumeSpecName "kube-api-access-gjb6s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:42:09.355669 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.355641 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "1d777ee4-ca0b-47d8-bab8-230dc27b628b" (UID: "1d777ee4-ca0b-47d8-bab8-230dc27b628b"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:42:09.360984 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.360956 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1d777ee4-ca0b-47d8-bab8-230dc27b628b" (UID: "1d777ee4-ca0b-47d8-bab8-230dc27b628b"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:42:09.450521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.450488 2571 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1d777ee4-ca0b-47d8-bab8-230dc27b628b-oidc-ca\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:42:09.450521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.450516 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjb6s\" (UniqueName: \"kubernetes.io/projected/1d777ee4-ca0b-47d8-bab8-230dc27b628b-kube-api-access-gjb6s\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:42:09.450521 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.450529 2571 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1d777ee4-ca0b-47d8-bab8-230dc27b628b-tls-cert\") on node \"ip-10-0-138-42.ec2.internal\" DevicePath \"\"" Apr 17 17:42:09.937204 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.937175 2571 generic.go:358] "Generic (PLEG): container finished" podID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" containerID="1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21" exitCode=0 Apr 17 17:42:09.937570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.937210 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6645455594-7b7hf" Apr 17 17:42:09.937570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.937266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6645455594-7b7hf" event={"ID":"1d777ee4-ca0b-47d8-bab8-230dc27b628b","Type":"ContainerDied","Data":"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21"} Apr 17 17:42:09.937570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.937300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6645455594-7b7hf" event={"ID":"1d777ee4-ca0b-47d8-bab8-230dc27b628b","Type":"ContainerDied","Data":"45faf68a69dd16eea8d22464164af03a69011e5d1cd103edbea5e07abe5ec147"} Apr 17 17:42:09.937570 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.937326 2571 scope.go:117] "RemoveContainer" containerID="1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21" Apr 17 17:42:09.945288 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.945273 2571 scope.go:117] "RemoveContainer" containerID="1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21" Apr 17 17:42:09.945517 ip-10-0-138-42 kubenswrapper[2571]: E0417 17:42:09.945498 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21\": container with ID starting with 1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21 not found: ID does not exist" containerID="1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21" Apr 17 17:42:09.945586 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.945524 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21"} err="failed to get container status \"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21\": rpc error: code = 
NotFound desc = could not find container \"1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21\": container with ID starting with 1d7adfe41808695180206877810d648e0a5c5076c9c0640ff6c5a08c2b2b6c21 not found: ID does not exist" Apr 17 17:42:09.962508 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.962484 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:42:09.978235 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:09.978215 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6645455594-7b7hf"] Apr 17 17:42:11.916759 ip-10-0-138-42 kubenswrapper[2571]: I0417 17:42:11.916724 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" path="/var/lib/kubelet/pods/1d777ee4-ca0b-47d8-bab8-230dc27b628b/volumes" Apr 17 18:02:16.892590 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:16.892500 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86f479759f-gmmvw_3b88b667-2264-4548-a4ff-8129b04fc720/authorino/0.log" Apr 17 18:02:21.568674 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:21.568640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-wqcvj_72f44ff3-707a-444a-bfc9-5dd333f3568a/manager/0.log" Apr 17 18:02:21.683861 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:21.683814 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-j4tff_9f9e55fa-b286-428a-a040-5ea7dd918f33/postgres/0.log" Apr 17 18:02:23.015433 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:23.015387 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86f479759f-gmmvw_3b88b667-2264-4548-a4ff-8129b04fc720/authorino/0.log" Apr 17 18:02:23.251815 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:23.251768 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7rbxr_05903816-57b3-4c2f-a474-39431e9e4315/manager/0.log" Apr 17 18:02:23.492207 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:23.492172 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2z7gq_657343ce-d367-433d-95fc-af04f60b3050/registry-server/0.log" Apr 17 18:02:23.879925 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:23.879830 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-z2xtr_4db4bee4-0928-4b25-bae2-4ceac22bdf41/manager/0.log" Apr 17 18:02:24.226799 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:24.226769 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds_d79f2435-2abc-482c-a314-824069d12177/istio-proxy/0.log" Apr 17 18:02:24.703867 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:24.703779 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-p962x_30a1fd64-c3a6-42fe-a9d7-f0869357a3d0/istio-proxy/0.log" Apr 17 18:02:29.362399 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.362369 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rk54/must-gather-rcqv7"] Apr 17 18:02:29.362785 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.362768 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" containerName="authorino" Apr 17 18:02:29.362785 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.362780 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" containerName="authorino" Apr 17 18:02:29.362861 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.362839 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1d777ee4-ca0b-47d8-bab8-230dc27b628b" containerName="authorino" Apr 17 18:02:29.365908 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.365892 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.369685 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.369663 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8rk54\"/\"default-dockercfg-tkwzj\"" Apr 17 18:02:29.370951 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.370933 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"kube-root-ca.crt\"" Apr 17 18:02:29.371032 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.370942 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"openshift-service-ca.crt\"" Apr 17 18:02:29.376827 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.376805 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/must-gather-rcqv7"] Apr 17 18:02:29.384592 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.384567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfk9\" (UniqueName: \"kubernetes.io/projected/2e61eac4-a776-4b9c-950c-16b14cb52dae-kube-api-access-qvfk9\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.384714 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.384619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e61eac4-a776-4b9c-950c-16b14cb52dae-must-gather-output\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 
17 18:02:29.485429 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.485395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfk9\" (UniqueName: \"kubernetes.io/projected/2e61eac4-a776-4b9c-950c-16b14cb52dae-kube-api-access-qvfk9\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.485624 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.485551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e61eac4-a776-4b9c-950c-16b14cb52dae-must-gather-output\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.485926 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.485907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e61eac4-a776-4b9c-950c-16b14cb52dae-must-gather-output\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.500220 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.500192 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfk9\" (UniqueName: \"kubernetes.io/projected/2e61eac4-a776-4b9c-950c-16b14cb52dae-kube-api-access-qvfk9\") pod \"must-gather-rcqv7\" (UID: \"2e61eac4-a776-4b9c-950c-16b14cb52dae\") " pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.675851 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.675761 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rk54/must-gather-rcqv7" Apr 17 18:02:29.800511 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.800485 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/must-gather-rcqv7"] Apr 17 18:02:29.802676 ip-10-0-138-42 kubenswrapper[2571]: W0417 18:02:29.802646 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e61eac4_a776_4b9c_950c_16b14cb52dae.slice/crio-ff77aac89a5bbe715299d8a92cc3f3c4567747ee046af1df01b0a9288b68ae53 WatchSource:0}: Error finding container ff77aac89a5bbe715299d8a92cc3f3c4567747ee046af1df01b0a9288b68ae53: Status 404 returned error can't find the container with id ff77aac89a5bbe715299d8a92cc3f3c4567747ee046af1df01b0a9288b68ae53 Apr 17 18:02:29.804468 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:29.804448 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:02:30.167308 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:30.167275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/must-gather-rcqv7" event={"ID":"2e61eac4-a776-4b9c-950c-16b14cb52dae","Type":"ContainerStarted","Data":"ff77aac89a5bbe715299d8a92cc3f3c4567747ee046af1df01b0a9288b68ae53"} Apr 17 18:02:31.176265 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:31.175346 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/must-gather-rcqv7" event={"ID":"2e61eac4-a776-4b9c-950c-16b14cb52dae","Type":"ContainerStarted","Data":"6f1f36a336c276d812cfeaf7242cc01ad2105f31ac25b8f3ff43b4d5ce4a168a"} Apr 17 18:02:31.176265 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:31.175395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/must-gather-rcqv7" 
event={"ID":"2e61eac4-a776-4b9c-950c-16b14cb52dae","Type":"ContainerStarted","Data":"c837f65edc03c7d0152abc1246398ebf1009119486c2fde57b3bfaa9240b3a99"} Apr 17 18:02:31.215444 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:31.215382 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rk54/must-gather-rcqv7" podStartSLOduration=1.320911611 podStartE2EDuration="2.215365322s" podCreationTimestamp="2026-04-17 18:02:29 +0000 UTC" firstStartedPulling="2026-04-17 18:02:29.804609085 +0000 UTC m=+2166.478462400" lastFinishedPulling="2026-04-17 18:02:30.699062797 +0000 UTC m=+2167.372916111" observedRunningTime="2026-04-17 18:02:31.214925274 +0000 UTC m=+2167.888778609" watchObservedRunningTime="2026-04-17 18:02:31.215365322 +0000 UTC m=+2167.889218658" Apr 17 18:02:32.431806 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:32.431766 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4czqt_203be660-71e2-4aef-9c05-185d66f59ebb/global-pull-secret-syncer/0.log" Apr 17 18:02:32.594077 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:32.594041 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nd4xv_a0b85ad2-2ebd-48c9-8952-e5d89d308e59/konnectivity-agent/0.log" Apr 17 18:02:32.714949 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:32.714912 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-42.ec2.internal_12ee1495c5f896624a48870a38fd93a3/haproxy/0.log" Apr 17 18:02:36.647778 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:36.647105 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86f479759f-gmmvw_3b88b667-2264-4548-a4ff-8129b04fc720/authorino/0.log" Apr 17 18:02:36.712603 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:36.712569 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7rbxr_05903816-57b3-4c2f-a474-39431e9e4315/manager/0.log" Apr 17 18:02:36.785838 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:36.785799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2z7gq_657343ce-d367-433d-95fc-af04f60b3050/registry-server/0.log" Apr 17 18:02:37.027495 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:37.027452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-z2xtr_4db4bee4-0928-4b25-bae2-4ceac22bdf41/manager/0.log" Apr 17 18:02:38.320293 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.320258 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/alertmanager/0.log" Apr 17 18:02:38.346899 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.346866 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/config-reloader/0.log" Apr 17 18:02:38.371650 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.371614 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/kube-rbac-proxy-web/0.log" Apr 17 18:02:38.397584 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.397549 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/kube-rbac-proxy/0.log" Apr 17 18:02:38.425922 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.425891 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/kube-rbac-proxy-metric/0.log" Apr 17 18:02:38.452012 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.451984 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/prom-label-proxy/0.log" Apr 17 18:02:38.479302 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.479268 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d6ed31df-b9fd-4360-9928-dee0dfd5ecfd/init-config-reloader/0.log" Apr 17 18:02:38.712126 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.712078 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5qcf7_dd4675a1-0ff5-4c27-a0da-329554311931/node-exporter/0.log" Apr 17 18:02:38.745716 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.745675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5qcf7_dd4675a1-0ff5-4c27-a0da-329554311931/kube-rbac-proxy/0.log" Apr 17 18:02:38.786673 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:38.786640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5qcf7_dd4675a1-0ff5-4c27-a0da-329554311931/init-textfile/0.log" Apr 17 18:02:39.543744 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.543691 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vlqmq_b7058f2c-5b01-4ef0-a631-06472e24ae23/prometheus-operator/0.log" Apr 17 18:02:39.574580 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.574553 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vlqmq_b7058f2c-5b01-4ef0-a631-06472e24ae23/kube-rbac-proxy/0.log" Apr 17 18:02:39.606026 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.605982 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-25d7j_99b4f532-1014-4577-9b72-1fbd93101e6a/prometheus-operator-admission-webhook/0.log" Apr 17 18:02:39.641596 
ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.641549 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9f65df4fc-5wqfp_5744fb14-ebdf-446f-8539-4368921db8c4/telemeter-client/0.log" Apr 17 18:02:39.668575 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.668541 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9f65df4fc-5wqfp_5744fb14-ebdf-446f-8539-4368921db8c4/reload/0.log" Apr 17 18:02:39.696970 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.696928 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9f65df4fc-5wqfp_5744fb14-ebdf-446f-8539-4368921db8c4/kube-rbac-proxy/0.log" Apr 17 18:02:39.737230 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.737197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/thanos-query/0.log" Apr 17 18:02:39.766600 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.766564 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/kube-rbac-proxy-web/0.log" Apr 17 18:02:39.798468 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.798386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/kube-rbac-proxy/0.log" Apr 17 18:02:39.827853 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.827730 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/prom-label-proxy/0.log" Apr 17 18:02:39.855845 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.855810 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/kube-rbac-proxy-rules/0.log"
Apr 17 18:02:39.883665 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:39.883628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d47484b8-fqdcg_924523ba-7166-413d-b895-964f97a258a0/kube-rbac-proxy-metrics/0.log"
Apr 17 18:02:41.066193 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.066156 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-tp7vv_fe1425c3-7bb7-4eea-8893-f2088028e46e/networking-console-plugin/0.log"
Apr 17 18:02:41.255556 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.255519 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"]
Apr 17 18:02:41.260498 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.260470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.266173 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.266143 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"]
Apr 17 18:02:41.305832 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.305795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvdz\" (UniqueName: \"kubernetes.io/projected/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-kube-api-access-7gvdz\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.306042 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.305841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-proc\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.306042 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.305879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-podres\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.306042 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.305917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-sys\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.306042 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.305945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-lib-modules\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406549 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-lib-modules\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406549 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvdz\" (UniqueName: \"kubernetes.io/projected/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-kube-api-access-7gvdz\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406819 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-proc\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406819 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-podres\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406819 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-lib-modules\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406819 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-proc\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406819 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-sys\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406992 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-podres\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.406992 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.406867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-sys\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.417264 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.417227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvdz\" (UniqueName: \"kubernetes.io/projected/25a4c3e1-4f97-4de6-813b-832c72d0d9a8-kube-api-access-7gvdz\") pod \"perf-node-gather-daemonset-wp8tz\" (UID: \"25a4c3e1-4f97-4de6-813b-832c72d0d9a8\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.574010 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.573966 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:41.733503 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:41.733469 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"]
Apr 17 18:02:41.736882 ip-10-0-138-42 kubenswrapper[2571]: W0417 18:02:41.736849 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod25a4c3e1_4f97_4de6_813b_832c72d0d9a8.slice/crio-8140673d9e95ec13db98fe9ef52674d44d3825c440cb81388df358f469f35e1e WatchSource:0}: Error finding container 8140673d9e95ec13db98fe9ef52674d44d3825c440cb81388df358f469f35e1e: Status 404 returned error can't find the container with id 8140673d9e95ec13db98fe9ef52674d44d3825c440cb81388df358f469f35e1e
Apr 17 18:02:42.234688 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:42.234649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz" event={"ID":"25a4c3e1-4f97-4de6-813b-832c72d0d9a8","Type":"ContainerStarted","Data":"0f872b70ec821aa6bda1b5e394e0cc98f67515cf343f6dfd9b6a7c7e8326d95d"}
Apr 17 18:02:42.235141 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:42.234691 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz" event={"ID":"25a4c3e1-4f97-4de6-813b-832c72d0d9a8","Type":"ContainerStarted","Data":"8140673d9e95ec13db98fe9ef52674d44d3825c440cb81388df358f469f35e1e"}
Apr 17 18:02:42.235141 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:42.234808 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:42.253680 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:42.253625 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz" podStartSLOduration=1.253606818 podStartE2EDuration="1.253606818s" podCreationTimestamp="2026-04-17 18:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:02:42.252253946 +0000 UTC m=+2178.926107282" watchObservedRunningTime="2026-04-17 18:02:42.253606818 +0000 UTC m=+2178.927460154"
Apr 17 18:02:43.755988 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:43.755952 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rr5tw_5f010296-632c-435e-b1db-62d8eeeae050/dns/0.log"
Apr 17 18:02:43.780767 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:43.780733 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rr5tw_5f010296-632c-435e-b1db-62d8eeeae050/kube-rbac-proxy/0.log"
Apr 17 18:02:43.921338 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:43.921313 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zdmch_e5ac259a-b271-4b9e-9b77-a8a0b4118c19/dns-node-resolver/0.log"
Apr 17 18:02:44.490314 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:44.490274 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rfmdv_09957151-db22-4dfb-9042-61ea1fff6d0b/node-ca/0.log"
Apr 17 18:02:45.475180 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:45.475147 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfqvtds_d79f2435-2abc-482c-a314-824069d12177/istio-proxy/0.log"
Apr 17 18:02:45.734981 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:45.734897 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-p962x_30a1fd64-c3a6-42fe-a9d7-f0869357a3d0/istio-proxy/0.log"
Apr 17 18:02:46.382210 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:46.382174 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zflhg_d55bc00d-c046-4764-9cfb-801efb7b23b8/serve-healthcheck-canary/0.log"
Apr 17 18:02:46.916836 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:46.916804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2kgc2_707dd6a1-ae31-44f2-b222-f3bb7a67e699/kube-rbac-proxy/0.log"
Apr 17 18:02:46.944357 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:46.944327 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2kgc2_707dd6a1-ae31-44f2-b222-f3bb7a67e699/exporter/0.log"
Apr 17 18:02:46.972482 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:46.972455 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2kgc2_707dd6a1-ae31-44f2-b222-f3bb7a67e699/extractor/0.log"
Apr 17 18:02:48.252114 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:48.252086 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-wp8tz"
Apr 17 18:02:49.422778 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:49.422741 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-wqcvj_72f44ff3-707a-444a-bfc9-5dd333f3568a/manager/0.log"
Apr 17 18:02:49.449273 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:49.449242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-j4tff_9f9e55fa-b286-428a-a040-5ea7dd918f33/postgres/0.log"
Apr 17 18:02:50.791347 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:50.791316 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6f45766749-wnjd5_6318e701-9cfb-4ba7-a8ea-74909ddba5dd/manager/0.log"
Apr 17 18:02:57.490370 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.490337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/kube-multus-additional-cni-plugins/0.log"
Apr 17 18:02:57.514605 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.514579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/egress-router-binary-copy/0.log"
Apr 17 18:02:57.538478 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.538452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/cni-plugins/0.log"
Apr 17 18:02:57.563096 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.563062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/bond-cni-plugin/0.log"
Apr 17 18:02:57.586535 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.586510 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/routeoverride-cni/0.log"
Apr 17 18:02:57.615132 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.615107 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/whereabouts-cni-bincopy/0.log"
Apr 17 18:02:57.643017 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.642995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxz9_5d2765e9-81b0-417d-9da1-edd5712b0fc4/whereabouts-cni/0.log"
Apr 17 18:02:57.921467 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.921397 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-glbqg_a9fa17fb-8628-4f11-aa6a-a17be56e125d/kube-multus/0.log"
Apr 17 18:02:57.967877 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.967846 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fczvt_8f82208d-ba82-434a-a209-d69847b4e54b/network-metrics-daemon/0.log"
Apr 17 18:02:57.985172 ip-10-0-138-42 kubenswrapper[2571]: I0417 18:02:57.985127 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fczvt_8f82208d-ba82-434a-a209-d69847b4e54b/kube-rbac-proxy/0.log"