Apr 20 14:26:55.647481 ip-10-0-140-30 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:26:56.194933 ip-10-0-140-30 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:56.194933 ip-10-0-140-30 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:26:56.194933 ip-10-0-140-30 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:56.194933 ip-10-0-140-30 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:26:56.194933 ip-10-0-140-30 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:56.197465 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.197362 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:26:56.200872 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200852 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:56.200872 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200872 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200875 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200879 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200883 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200886 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200889 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200893 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200896 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200900 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200903 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200906 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200908 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200911 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200914 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200916 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200919 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200921 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200924 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200927 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200930 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:56.200942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200932 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200937 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200941 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200944 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200947 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200950 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200954 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200957 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200960 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200963 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200965 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200968 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200971 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200975 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200978 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200980 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200983 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200988 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200990 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:56.201428 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200993 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200996 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.200999 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201002 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201004 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201007 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201010 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201013 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201015 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201017 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201020 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201022 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201025 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201027 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201030 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201033 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201036 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201039 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201042 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201044 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:56.201922 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201047 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201050 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201052 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201055 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201057 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201060 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201064 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201066 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201069 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201072 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201074 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201077 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201080 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201082 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201085 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201087 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201092 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201096 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201098 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201101 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:56.202427 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201104 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201106 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201109 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201112 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201114 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201116 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201531 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201538 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201541 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201544 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201547 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201549 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201552 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201555 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201557 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201560 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201562 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201566 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201570 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201572 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:56.202913 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201575 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201579 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201581 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201584 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201587 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201590 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201592 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201595 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201597 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201600 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201603 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201605 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201608 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201611 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201614 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201616 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201618 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201621 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201624 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201627 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:56.203551 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201630 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201633 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201635 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201638 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201642 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201645 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201648 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201651 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201654 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201658 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201660 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201664 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201666 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201669 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201671 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201674 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201677 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201679 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201682 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:56.204092 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201684 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201687 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201689 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201692 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201694 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201696 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201699 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201702 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201704 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201707 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201711 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201714 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201717 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201720 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201722 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201725 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201727 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201730 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201732 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201735 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:56.204580 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201737 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201740 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201743 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201745 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201747 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201750 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201752 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201755 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201757 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201760 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201762 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201765 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.201767 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203861 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203872 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203881 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203886 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203891 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203894 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203899 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203904 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:26:56.205074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203908 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203911 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203915 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203919 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203922 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203925 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203929 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203932 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203935 2572 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203938 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203941 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203946 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203949 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203951 2572 flags.go:64] FLAG: --config-dir=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203954 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203958 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203962 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203965 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203968 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203972 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203975 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203978 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203982 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203985 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203988 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:26:56.205601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203993 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203996 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.203999 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204002 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204005 2572 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204008 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204014 2572 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204017 2572 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204020 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204024 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204027 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204032 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204035 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204038 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204041 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204044 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204047 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204050 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204053 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204056 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204059 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204062 2572 flags.go:64] FLAG: --feature-gates=""
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204066 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204070 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204073 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 14:26:56.206229 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204076 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204079 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 20
14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204082 2572 flags.go:64] FLAG: --help="false" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204085 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204089 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204093 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204096 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204100 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204103 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204106 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204109 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204112 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204115 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204119 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204133 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204137 2572 
flags.go:64] FLAG: --kube-reserved="" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204140 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204143 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204146 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204149 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204152 2572 flags.go:64] FLAG: --lock-file="" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204155 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204158 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204162 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:26:56.206870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204167 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204170 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204173 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204176 2572 flags.go:64] FLAG: --logging-format="text" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204179 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204182 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:26:56.207464 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:56.204185 2572 flags.go:64] FLAG: --manifest-url="" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204188 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204192 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204196 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204200 2572 flags.go:64] FLAG: --max-pods="110" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204203 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204206 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204209 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204212 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204215 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204219 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204221 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204231 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204234 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204237 2572 flags.go:64] 
FLAG: --oom-score-adj="-999" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204241 2572 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204244 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:26:56.207464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204250 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204254 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204257 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204260 2572 flags.go:64] FLAG: --port="10250" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204264 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204267 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0da0337688301ee4a" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204270 2572 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204273 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204276 2572 flags.go:64] FLAG: --register-node="true" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204279 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204283 2572 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204286 2572 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:26:56.208021 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204289 2572 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204292 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204295 2572 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204299 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204302 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204305 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204308 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204311 2572 flags.go:64] FLAG: --runonce="false" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204314 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204317 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204321 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204324 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204327 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204330 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:26:56.208021 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204334 2572 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204337 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204339 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204343 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204345 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204349 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204352 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204358 2572 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204361 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204366 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204369 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204372 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204376 2572 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204379 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204382 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: 
I0420 14:26:56.204385 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204388 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204392 2572 flags.go:64] FLAG: --v="2" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204396 2572 flags.go:64] FLAG: --version="false" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204400 2572 flags.go:64] FLAG: --vmodule="" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204405 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.204409 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204505 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204509 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:26:56.208659 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204513 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204516 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204519 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204522 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204525 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:26:56.209247 
ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204528 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204531 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204534 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204537 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204540 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204543 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204546 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204548 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204551 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204553 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204558 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204561 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204563 2572 feature_gate.go:328] unrecognized 
feature gate: InsightsConfigAPI Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204566 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204568 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:26:56.209247 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204571 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204573 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204576 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204578 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204581 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204584 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204587 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204590 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204593 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204596 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204598 2572 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204601 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204604 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204606 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204609 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204611 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204614 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204616 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204619 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204621 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:26:56.209787 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204624 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204627 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204630 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 
20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204632 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204635 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204640 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204643 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204646 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204649 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204653 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204657 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204661 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204664 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204668 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204670 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204673 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204675 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204678 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204681 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204684 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:26:56.210292 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204686 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204689 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204691 2572 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerification Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204694 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204697 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204699 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204702 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204704 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204707 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204710 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204712 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204715 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204718 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204721 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204724 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204726 2572 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204730 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204735 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204738 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:26:56.210782 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204741 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:26:56.211298 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204744 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:26:56.211298 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204747 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:26:56.211298 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204750 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:26:56.211298 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.204753 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:26:56.211298 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.205345 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 20 14:26:56.214663 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.214638 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 14:26:56.214663 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.214661 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214721 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214727 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214730 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214733 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214737 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214740 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214742 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214745 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214748 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214751 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 
14:26:56.214754 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214757 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214759 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214762 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214764 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214767 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214769 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214772 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214774 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:26:56.214791 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214777 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214779 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214783 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214785 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:26:56.215308 
ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214788 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214791 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214794 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214797 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214800 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214803 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214805 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214808 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214810 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214813 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214816 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214819 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214821 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 
14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214824 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214826 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214829 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:26:56.215308 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214831 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214833 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214836 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214838 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214841 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214843 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214846 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214848 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214851 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214854 2572 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214856 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214859 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214861 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214863 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214867 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214872 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214877 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214880 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214883 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:26:56.215810 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214887 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214890 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214893 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214896 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214899 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214901 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214904 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214906 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214909 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214912 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214914 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214917 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214919 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 
14:26:56.214922 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214924 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214927 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214930 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214932 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214935 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:26:56.216293 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214939 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214941 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214944 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214947 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214949 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214952 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214954 2572 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214957 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.214960 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.214966 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215069 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215075 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215078 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215081 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215084 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215087 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:26:56.216800 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215090 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace 
Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215093 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215096 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215098 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215101 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215104 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215107 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215110 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215112 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215115 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215119 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215136 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215140 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215143 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215146 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215149 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215154 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215157 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215160 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:26:56.217227 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215162 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215165 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215168 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215170 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215174 2572 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215178 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215182 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215185 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215188 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215191 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215194 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215197 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215199 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215202 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215204 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215207 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215210 2572 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215212 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215215 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:26:56.217703 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215217 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215220 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215222 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215225 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215228 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215230 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215233 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215235 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215238 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215240 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215243 
2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215252 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215255 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215257 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215260 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215262 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215265 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215268 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215270 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215273 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:26:56.218189 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215275 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215278 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215280 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:26:56.218672 
ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215283 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215285 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215288 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215290 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215293 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215295 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215297 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215300 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215302 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215305 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215307 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215309 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215312 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 
20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215315 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215318 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215320 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215323 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:26:56.218672 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215325 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:26:56.219178 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:56.215328 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:26:56.219178 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.215333 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:26:56.219178 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.216154 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 14:26:56.221487 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.221471 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 14:26:56.222555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.222542 2572 server.go:1019] "Starting client 
certificate rotation" Apr 20 14:26:56.222658 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.222641 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 14:26:56.222693 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.222686 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 14:26:56.255832 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.255806 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 14:26:56.260069 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.260031 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 14:26:56.279369 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.279342 2572 log.go:25] "Validated CRI v1 runtime API" Apr 20 14:26:56.286028 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.286004 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:26:56.286259 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.286242 2572 log.go:25] "Validated CRI v1 image API" Apr 20 14:26:56.287517 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.287501 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 14:26:56.293660 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.293639 2572 fs.go:135] Filesystem UUIDs: map[370cbdc2-b058-4aac-a412-bfcbb32fceaa:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f923fa25-f10c-404b-b70b-e572bd40eb25:/dev/nvme0n1p4] Apr 20 14:26:56.293721 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.293660 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 14:26:56.300751 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.300617 2572 manager.go:217] Machine: {Timestamp:2026-04-20 14:26:56.29848389 +0000 UTC m=+0.505476363 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101368 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29e4360fd52c7b9cfdcf6f03e4b62d SystemUUID:ec29e436-0fd5-2c7b-9cfd-cf6f03e4b62d BootID:0befc784-91e2-47f2-9688-b06521163d6b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4e:cf:09:fa:43 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4e:cf:09:fa:43 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:59:00:cd:93:e0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 14:26:56.300751 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.300735 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 14:26:56.300909 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.300832 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 14:26:56.302014 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.301984 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 14:26:56.302176 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.302016 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-30.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 14:26:56.302220 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.302187 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 14:26:56.302220 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.302196 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 14:26:56.302220 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.302210 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:26:56.303059 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.303049 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:26:56.304505 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.304494 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:26:56.304624 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.304614 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 14:26:56.305797 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.305782 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdq4n" Apr 20 14:26:56.307556 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.307544 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 20 14:26:56.307587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.307569 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 14:26:56.307587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.307586 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 14:26:56.307652 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:56.307596 2572 kubelet.go:397] "Adding apiserver pod source" Apr 20 14:26:56.307652 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.307605 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 14:26:56.308631 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.308619 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:26:56.308668 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.308639 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:26:56.311849 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.311831 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdq4n" Apr 20 14:26:56.312806 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.312778 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 14:26:56.314448 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.314434 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 14:26:56.316278 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316267 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316283 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316290 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316296 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 14:26:56.316330 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:56.316301 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316308 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316314 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316319 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316326 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 14:26:56.316330 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316332 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 14:26:56.316563 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316342 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 14:26:56.316563 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.316351 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 14:26:56.320517 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.320499 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 14:26:56.320517 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.320516 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 14:26:56.320770 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.320752 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:56.322301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.322286 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:56.325165 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:56.325150 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 14:26:56.325260 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.325192 2572 server.go:1295] "Started kubelet" Apr 20 14:26:56.325335 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.325294 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 14:26:56.325393 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.325353 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 14:26:56.325393 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.325349 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 14:26:56.326154 ip-10-0-140-30 systemd[1]: Started Kubernetes Kubelet. Apr 20 14:26:56.326389 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.326372 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-30.ec2.internal" not found Apr 20 14:26:56.328193 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.328170 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 20 14:26:56.329934 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.329907 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 14:26:56.335337 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.335316 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 14:26:56.335441 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.335335 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 14:26:56.335931 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.335914 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 14:26:56.335931 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.335917 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 14:26:56.336053 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.335939 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 14:26:56.336053 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.336011 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 20 14:26:56.336053 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.336020 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 20 14:26:56.336266 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.336239 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-30.ec2.internal\" not found" Apr 20 14:26:56.337011 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.336995 2572 factory.go:55] Registering systemd factory Apr 20 14:26:56.337105 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337047 2572 factory.go:223] Registration of the systemd container factory successfully Apr 20 14:26:56.337198 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337165 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:56.337453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337438 2572 factory.go:153] Registering CRI-O factory Apr 20 14:26:56.337532 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337456 2572 factory.go:223] Registration of the crio container factory successfully Apr 20 14:26:56.337532 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.337517 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 14:26:56.337532 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337529 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 14:26:56.337694 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337556 2572 factory.go:103] Registering Raw factory Apr 20 14:26:56.337694 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337573 2572 manager.go:1196] Started watching for new ooms in manager Apr 20 14:26:56.338000 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.337987 2572 manager.go:319] Starting recovery of all containers Apr 20 14:26:56.340633 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.340611 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-30.ec2.internal\" not found" node="ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.342704 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.342682 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-30.ec2.internal" not found Apr 20 14:26:56.347738 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.347720 2572 manager.go:324] Recovery completed Apr 20 14:26:56.352567 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.352551 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:26:56.354739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.354723 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:26:56.354828 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.354752 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 20 14:26:56.354828 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.354762 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:26:56.355304 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.355291 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 14:26:56.355360 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.355303 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 14:26:56.355360 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.355320 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:26:56.357734 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.357721 2572 policy_none.go:49] "None policy: Start" Apr 20 14:26:56.357781 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.357737 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 14:26:56.357781 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.357747 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 20 14:26:56.394838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.394812 2572 manager.go:341] "Starting Device Plugin manager" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.394851 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.394865 2572 server.go:85] "Starting device plugin registration server" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.395176 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.395190 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 
14:26:56.395274 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.395365 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.395373 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.396076 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 14:26:56.399681 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.396118 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-30.ec2.internal\" not found" Apr 20 14:26:56.402479 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.402459 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-30.ec2.internal" not found Apr 20 14:26:56.460645 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.460610 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 14:26:56.461887 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.461870 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 14:26:56.461983 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.461900 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 14:26:56.461983 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.461925 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 14:26:56.461983 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.461934 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 14:26:56.462106 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:56.462051 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 14:26:56.464288 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.464266 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:56.496094 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.496061 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:26:56.497170 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.497150 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:26:56.497474 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.497456 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:26:56.497547 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.497512 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:26:56.497681 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.497652 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.507312 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.507285 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.562595 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.562537 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal"] 
Apr 20 14:26:56.565178 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.565144 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.565325 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.565151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.584821 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.584796 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.588913 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.588893 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.595314 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.595292 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:26:56.601276 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.601259 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:26:56.637180 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.637122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.637348 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.637217 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.637348 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.637241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c8ce0f8ff8898a2f19fd2c9a4c4f3273-config\") pod \"kube-apiserver-proxy-ip-10-0-140-30.ec2.internal\" (UID: \"c8ce0f8ff8898a2f19fd2c9a4c4f3273\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.737944 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.737944 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.737944 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/c8ce0f8ff8898a2f19fd2c9a4c4f3273-config\") pod \"kube-apiserver-proxy-ip-10-0-140-30.ec2.internal\" (UID: \"c8ce0f8ff8898a2f19fd2c9a4c4f3273\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.738210 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c8ce0f8ff8898a2f19fd2c9a4c4f3273-config\") pod \"kube-apiserver-proxy-ip-10-0-140-30.ec2.internal\" (UID: \"c8ce0f8ff8898a2f19fd2c9a4c4f3273\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.738210 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.738210 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.737965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1028cc4a1cd2bb7f24fa2491ce4493a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal\" (UID: \"1028cc4a1cd2bb7f24fa2491ce4493a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.898785 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.898744 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" Apr 20 14:26:56.904392 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:56.904372 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" Apr 20 14:26:57.222627 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.222431 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:26:57.223332 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.222759 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:26:57.223332 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.222792 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:26:57.223332 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.222829 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:26:57.308175 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.308142 2572 apiserver.go:52] "Watching apiserver" Apr 20 14:26:57.314296 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.314236 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:21:56 +0000 UTC" deadline="2027-09-13 16:22:11.551517334 +0000 UTC" Apr 20 14:26:57.314296 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.314285 2572 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12265h55m14.237235719s" Apr 20 14:26:57.320611 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.320588 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:26:57.321668 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.321646 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9kgc2","openshift-network-diagnostics/network-check-target-dgqlp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb","openshift-dns/node-resolver-w77dh","openshift-multus/network-metrics-daemon-g7tjl","openshift-network-operator/iptables-alerter-52mwl","openshift-ovn-kubernetes/ovnkube-node-x29sq","kube-system/konnectivity-agent-p697p","kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vw667","openshift-image-registry/node-ca-f759k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal","openshift-multus/multus-9gnvx"] Apr 20 14:26:57.324492 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.324470 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.324541 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.324490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:26:57.324573 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.324556 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:26:57.325920 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.325900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.326961 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.326935 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.327060 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.326935 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:26:57.327141 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.327114 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mmqxz\"" Apr 20 14:26:57.327273 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.327258 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:26:57.327344 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.327284 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.327545 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.327527 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:26:57.327877 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.327864 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.328181 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.328165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.328764 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.328735 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:26:57.328764 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.328749 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.328900 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.328736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rfgjt\"" Apr 20 14:26:57.329803 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.329783 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.329905 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.329804 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.329987 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.329971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhcng\"" Apr 20 14:26:57.330536 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.330519 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.331214 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.330928 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:26:57.331214 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.331179 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.333413 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.333389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8skvl\"" Apr 20 14:26:57.333828 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.333809 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.333973 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.333944 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:26:57.334351 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.334330 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.334464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.334444 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.335419 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.335402 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:26:57.335689 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.335663 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.336538 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.336519 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:26:57.337108 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337091 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.337710 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337667 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:26:57.337710 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:26:57.337857 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mgnbv\"" Apr 20 14:26:57.337857 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337708 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.337857 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.337709 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:26:57.338093 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.338074 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwrw8\"" Apr 20 14:26:57.338167 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.338107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:26:57.338167 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.338139 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:26:57.338167 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.338161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 
20 14:26:57.338480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.338465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f759k" Apr 20 14:26:57.339221 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.339201 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.339306 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.339240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-25znw\"" Apr 20 14:26:57.339306 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.339261 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.339677 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.339658 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.340867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.340609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:26:57.340867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.340685 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:26:57.340867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.340691 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:26:57.340867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.340702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lslm2\"" Apr 20 14:26:57.341666 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.341651 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mp542\"" Apr 20 14:26:57.341739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.341654 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:26:57.341960 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.341943 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-etc-selinux\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.342010 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.341977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-netns\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342061 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vvg\" (UniqueName: \"kubernetes.io/projected/a649d03f-abf5-42ca-849e-903f5fdc0299-kube-api-access-b6vvg\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342061 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-systemd-units\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342147 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342147 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/26818813-da84-407b-b55f-77d9ffcbb474-hosts-file\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.342147 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342140 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26818813-da84-407b-b55f-77d9ffcbb474-tmp-dir\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.342253 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9brxs\" (UniqueName: \"kubernetes.io/projected/26818813-da84-407b-b55f-77d9ffcbb474-kube-api-access-9brxs\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.342253 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-node-log\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342253 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck85m\" (UniqueName: \"kubernetes.io/projected/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-kube-api-access-ck85m\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.342253 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-systemd\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.342438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-lib-modules\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.342438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342295 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-device-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.342438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-ovn\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.342438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/35778b02-a4c0-4a27-907b-2c96d1273465-konnectivity-ca\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.342602 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-conf\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.342602 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:26:57.342602 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.342602 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79f008e1-fae4-4db2-bdc8-eb46cf52e662-iptables-alerter-script\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.342602 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxprp\" (UniqueName: \"kubernetes.io/projected/79f008e1-fae4-4db2-bdc8-eb46cf52e662-kube-api-access-dxprp\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-sys-fs\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342696 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79f008e1-fae4-4db2-bdc8-eb46cf52e662-host-slash\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-etc-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-netd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.342838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-script-lib\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-run\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 
14:26:57.342873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-tuned\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-log-socket\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysconfig\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-kubernetes\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-config\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.342991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-env-overrides\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-var-lib-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-bin\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-d\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-tmp\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-registration-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dcv\" (UniqueName: \"kubernetes.io/projected/19b3975f-609f-427a-a428-db9cb8176eec-kube-api-access-98dcv\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.343213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-slash\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.343822 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:57.343253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-systemd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343268 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-host\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cnibin\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: 
\"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-sys\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqdx\" (UniqueName: \"kubernetes.io/projected/e1834aa4-27df-4840-ae0e-2df474de8c48-kube-api-access-6qqdx\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-kubelet\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343453 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a649d03f-abf5-42ca-849e-903f5fdc0299-ovn-node-metrics-cert\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-os-release\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35778b02-a4c0-4a27-907b-2c96d1273465-agent-certs\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-modprobe-d\") pod \"tuned-vw667\" (UID: 
\"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.343822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-var-lib-kubelet\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.344325 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj9rc\" (UniqueName: \"kubernetes.io/projected/9676c2f1-f258-4d20-af54-415ffdb94e09-kube-api-access-sj9rc\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.344325 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.343670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-socket-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.348490 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.348470 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:26:57.368637 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.368612 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9j25p" Apr 20 14:26:57.374620 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.374598 2572 csr.go:270] "Certificate signing 
request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9j25p" Apr 20 14:26:57.381116 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.381078 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ce0f8ff8898a2f19fd2c9a4c4f3273.slice/crio-8aee5180a1845781d5e0dacb672d58e03fe114b2e1ffa3b20851a34a8efa2720 WatchSource:0}: Error finding container 8aee5180a1845781d5e0dacb672d58e03fe114b2e1ffa3b20851a34a8efa2720: Status 404 returned error can't find the container with id 8aee5180a1845781d5e0dacb672d58e03fe114b2e1ffa3b20851a34a8efa2720 Apr 20 14:26:57.387988 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.387973 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:26:57.395942 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.395916 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1028cc4a1cd2bb7f24fa2491ce4493a5.slice/crio-72ce536253da942013cd3049f1c5a0889826dc5b64e77ee2d849cb1eca1e3a2e WatchSource:0}: Error finding container 72ce536253da942013cd3049f1c5a0889826dc5b64e77ee2d849cb1eca1e3a2e: Status 404 returned error can't find the container with id 72ce536253da942013cd3049f1c5a0889826dc5b64e77ee2d849cb1eca1e3a2e Apr 20 14:26:57.436933 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.436828 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 14:26:57.444763 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-os-release\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.444842 ip-10-0-140-30 kubenswrapper[2572]: I0420 
14:26:57.444777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-sys-fs\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.444842 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79f008e1-fae4-4db2-bdc8-eb46cf52e662-host-slash\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.444959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-hostroot\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.444959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-daemon-config\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.444959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-etc-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.444959 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:26:57.444919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-netd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.444959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-etc-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.444959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79f008e1-fae4-4db2-bdc8-eb46cf52e662-host-slash\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-script-lib\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.444993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-netd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-run\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-tuned\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-sys-fs\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445079 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-log-socket\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysconfig\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-kubernetes\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-config\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-env-overrides\") pod \"ovnkube-node-x29sq\" (UID: 
\"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysconfig\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-log-socket\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-k8s-cni-cncf-io\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.445256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-run\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-kubernetes\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") 
" pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-var-lib-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-bin\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-var-lib-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxlg\" (UniqueName: \"kubernetes.io/projected/db0116b2-344f-40ec-a0a2-b47e8cb06248-kube-api-access-ztxlg\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-cni-bin\") pod \"ovnkube-node-x29sq\" (UID: 
\"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-d\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-tmp\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-registration-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445566 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-d\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98dcv\" (UniqueName: \"kubernetes.io/projected/19b3975f-609f-427a-a428-db9cb8176eec-kube-api-access-98dcv\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-registration-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445638 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-slash\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-slash\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-bin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.445896 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-script-lib\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-systemd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-env-overrides\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-netns\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a649d03f-abf5-42ca-849e-903f5fdc0299-ovnkube-config\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-systemd\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-host\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cnibin\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-host\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db0116b2-344f-40ec-a0a2-b47e8cb06248-host\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cnibin\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-openvswitch\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.445962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.446549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-multus\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446028 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-sys\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqdx\" (UniqueName: \"kubernetes.io/projected/e1834aa4-27df-4840-ae0e-2df474de8c48-kube-api-access-6qqdx\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446105 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-sys\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-kubelet\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 
14:26:57.446173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a649d03f-abf5-42ca-849e-903f5fdc0299-ovn-node-metrics-cert\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-os-release\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-system-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " 
pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35778b02-a4c0-4a27-907b-2c96d1273465-agent-certs\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-modprobe-d\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-var-lib-kubelet\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj9rc\" (UniqueName: \"kubernetes.io/projected/9676c2f1-f258-4d20-af54-415ffdb94e09-kube-api-access-sj9rc\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-socket-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: 
\"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-etc-selinux\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.447458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-netns\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vvg\" (UniqueName: \"kubernetes.io/projected/a649d03f-abf5-42ca-849e-903f5fdc0299-kube-api-access-b6vvg\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-var-lib-kubelet\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-systemd-units\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-kubelet\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-etc-selinux\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446585 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-ovn-kubernetes\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-kubelet\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-conf-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-modprobe-d\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: 
I0420 14:26:57.446761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-os-release\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-etc-kubernetes\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-host-run-netns\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qgk\" (UniqueName: \"kubernetes.io/projected/64d6d293-736d-4b7e-86ca-9b3decf1c068-kube-api-access-z4qgk\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.448236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-multus-certs\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 
14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-systemd-units\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/26818813-da84-407b-b55f-77d9ffcbb474-hosts-file\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26818813-da84-407b-b55f-77d9ffcbb474-tmp-dir\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.446966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-socket-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9brxs\" (UniqueName: \"kubernetes.io/projected/26818813-da84-407b-b55f-77d9ffcbb474-kube-api-access-9brxs\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.449016 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/26818813-da84-407b-b55f-77d9ffcbb474-hosts-file\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-node-log\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck85m\" (UniqueName: \"kubernetes.io/projected/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-kube-api-access-ck85m\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-cnibin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-systemd\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.449016 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-node-log\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-lib-modules\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-device-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26818813-da84-407b-b55f-77d9ffcbb474-tmp-dir\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-ovn\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 
14:26:57.447289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db0116b2-344f-40ec-a0a2-b47e8cb06248-serviceca\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k" Apr 20 14:26:57.449016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-cni-binary-copy\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-lib-modules\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-socket-dir-parent\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 
14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1834aa4-27df-4840-ae0e-2df474de8c48-device-dir\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: \"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/35778b02-a4c0-4a27-907b-2c96d1273465-konnectivity-ca\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-conf\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: 
\"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79f008e1-fae4-4db2-bdc8-eb46cf52e662-iptables-alerter-script\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxprp\" (UniqueName: \"kubernetes.io/projected/79f008e1-fae4-4db2-bdc8-eb46cf52e662-kube-api-access-dxprp\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.447898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-systemd\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.447979 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.448161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79f008e1-fae4-4db2-bdc8-eb46cf52e662-iptables-alerter-script\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.448182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a649d03f-abf5-42ca-849e-903f5fdc0299-run-ovn\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.449571 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.448266 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. 
No retries permitted until 2026-04-20 14:26:57.948219106 +0000 UTC m=+2.155211542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.448291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-sysctl-conf\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.448454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/35778b02-a4c0-4a27-907b-2c96d1273465-konnectivity-ca\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.449292 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a649d03f-abf5-42ca-849e-903f5fdc0299-ovn-node-metrics-cert\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.449392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35778b02-a4c0-4a27-907b-2c96d1273465-agent-certs\") pod \"konnectivity-agent-p697p\" (UID: \"35778b02-a4c0-4a27-907b-2c96d1273465\") " pod="kube-system/konnectivity-agent-p697p" Apr 20 
14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.449532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-etc-tuned\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.450037 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.449557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9676c2f1-f258-4d20-af54-415ffdb94e09-tmp\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.463399 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.463377 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:26:57.463399 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.463401 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:26:57.463537 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.463412 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:57.463537 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.463516 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. 
No retries permitted until 2026-04-20 14:26:57.963492428 +0000 UTC m=+2.170484845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:57.465172 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.465105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" event={"ID":"c8ce0f8ff8898a2f19fd2c9a4c4f3273","Type":"ContainerStarted","Data":"8aee5180a1845781d5e0dacb672d58e03fe114b2e1ffa3b20851a34a8efa2720"} Apr 20 14:26:57.466064 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.466044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" event={"ID":"1028cc4a1cd2bb7f24fa2491ce4493a5","Type":"ContainerStarted","Data":"72ce536253da942013cd3049f1c5a0889826dc5b64e77ee2d849cb1eca1e3a2e"} Apr 20 14:26:57.474726 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.474702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj9rc\" (UniqueName: \"kubernetes.io/projected/9676c2f1-f258-4d20-af54-415ffdb94e09-kube-api-access-sj9rc\") pod \"tuned-vw667\" (UID: \"9676c2f1-f258-4d20-af54-415ffdb94e09\") " pod="openshift-cluster-node-tuning-operator/tuned-vw667" Apr 20 14:26:57.474897 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.474868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqdx\" (UniqueName: \"kubernetes.io/projected/e1834aa4-27df-4840-ae0e-2df474de8c48-kube-api-access-6qqdx\") pod \"aws-ebs-csi-driver-node-jnnqb\" (UID: 
\"e1834aa4-27df-4840-ae0e-2df474de8c48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" Apr 20 14:26:57.475020 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.475003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxprp\" (UniqueName: \"kubernetes.io/projected/79f008e1-fae4-4db2-bdc8-eb46cf52e662-kube-api-access-dxprp\") pod \"iptables-alerter-52mwl\" (UID: \"79f008e1-fae4-4db2-bdc8-eb46cf52e662\") " pod="openshift-network-operator/iptables-alerter-52mwl" Apr 20 14:26:57.475341 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.475318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9brxs\" (UniqueName: \"kubernetes.io/projected/26818813-da84-407b-b55f-77d9ffcbb474-kube-api-access-9brxs\") pod \"node-resolver-w77dh\" (UID: \"26818813-da84-407b-b55f-77d9ffcbb474\") " pod="openshift-dns/node-resolver-w77dh" Apr 20 14:26:57.475606 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.475588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck85m\" (UniqueName: \"kubernetes.io/projected/9798cc7a-2b24-4609-adaf-fd0cb6fa296b-kube-api-access-ck85m\") pod \"multus-additional-cni-plugins-9kgc2\" (UID: \"9798cc7a-2b24-4609-adaf-fd0cb6fa296b\") " pod="openshift-multus/multus-additional-cni-plugins-9kgc2" Apr 20 14:26:57.475684 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.475617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vvg\" (UniqueName: \"kubernetes.io/projected/a649d03f-abf5-42ca-849e-903f5fdc0299-kube-api-access-b6vvg\") pod \"ovnkube-node-x29sq\" (UID: \"a649d03f-abf5-42ca-849e-903f5fdc0299\") " pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" Apr 20 14:26:57.476034 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.476018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dcv\" (UniqueName: 
\"kubernetes.io/projected/19b3975f-609f-427a-a428-db9cb8176eec-kube-api-access-98dcv\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:26:57.548633 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-kubelet\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.548633 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-conf-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-etc-kubernetes\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qgk\" (UniqueName: \"kubernetes.io/projected/64d6d293-736d-4b7e-86ca-9b3decf1c068-kube-api-access-z4qgk\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx" Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-kubelet\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-multus-certs\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-etc-kubernetes\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-multus-certs\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-conf-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-cnibin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.548867 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db0116b2-344f-40ec-a0a2-b47e8cb06248-serviceca\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-cnibin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-cni-binary-copy\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.548946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-socket-dir-parent\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-os-release\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-hostroot\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-socket-dir-parent\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-daemon-config\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-hostroot\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-os-release\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-k8s-cni-cncf-io\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-k8s-cni-cncf-io\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxlg\" (UniqueName: \"kubernetes.io/projected/db0116b2-344f-40ec-a0a2-b47e8cb06248-kube-api-access-ztxlg\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.549301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-bin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-netns\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-bin\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db0116b2-344f-40ec-a0a2-b47e8cb06248-host\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-multus\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-run-netns\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-system-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db0116b2-344f-40ec-a0a2-b47e8cb06248-serviceca\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db0116b2-344f-40ec-a0a2-b47e8cb06248-host\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-host-var-lib-cni-multus\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64d6d293-736d-4b7e-86ca-9b3decf1c068-system-cni-dir\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.549713 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.549571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-cni-binary-copy\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.550178 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.550159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64d6d293-736d-4b7e-86ca-9b3decf1c068-multus-daemon-config\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.556868 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.556844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4qgk\" (UniqueName: \"kubernetes.io/projected/64d6d293-736d-4b7e-86ca-9b3decf1c068-kube-api-access-z4qgk\") pod \"multus-9gnvx\" (UID: \"64d6d293-736d-4b7e-86ca-9b3decf1c068\") " pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.557279 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.557263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxlg\" (UniqueName: \"kubernetes.io/projected/db0116b2-344f-40ec-a0a2-b47e8cb06248-kube-api-access-ztxlg\") pod \"node-ca-f759k\" (UID: \"db0116b2-344f-40ec-a0a2-b47e8cb06248\") " pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.646917 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.646881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9kgc2"
Apr 20 14:26:57.653756 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.653722 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9798cc7a_2b24_4609_adaf_fd0cb6fa296b.slice/crio-c0167c635217ec58c8398691fdf119ef7f50a9bbac47430910cedf76ddb26cd1 WatchSource:0}: Error finding container c0167c635217ec58c8398691fdf119ef7f50a9bbac47430910cedf76ddb26cd1: Status 404 returned error can't find the container with id c0167c635217ec58c8398691fdf119ef7f50a9bbac47430910cedf76ddb26cd1
Apr 20 14:26:57.657798 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.655973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb"
Apr 20 14:26:57.664876 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.664849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1834aa4_27df_4840_ae0e_2df474de8c48.slice/crio-e6fd1bf2e7f9de39eff219141f08958b214e1f7badff0badf49a39acb39b21b7 WatchSource:0}: Error finding container e6fd1bf2e7f9de39eff219141f08958b214e1f7badff0badf49a39acb39b21b7: Status 404 returned error can't find the container with id e6fd1bf2e7f9de39eff219141f08958b214e1f7badff0badf49a39acb39b21b7
Apr 20 14:26:57.685039 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.685008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w77dh"
Apr 20 14:26:57.689701 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.689683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-52mwl"
Apr 20 14:26:57.691742 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.691718 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26818813_da84_407b_b55f_77d9ffcbb474.slice/crio-dd7172d44615505c940cf0a971f132609db6c2f46022779da33ed4bcee867460 WatchSource:0}: Error finding container dd7172d44615505c940cf0a971f132609db6c2f46022779da33ed4bcee867460: Status 404 returned error can't find the container with id dd7172d44615505c940cf0a971f132609db6c2f46022779da33ed4bcee867460
Apr 20 14:26:57.696267 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.696249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:26:57.697691 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.697665 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f008e1_fae4_4db2_bdc8_eb46cf52e662.slice/crio-8c6a53bc7369a13d8232b2b00681a1f8535605e709b36d44ff53f732bee10256 WatchSource:0}: Error finding container 8c6a53bc7369a13d8232b2b00681a1f8535605e709b36d44ff53f732bee10256: Status 404 returned error can't find the container with id 8c6a53bc7369a13d8232b2b00681a1f8535605e709b36d44ff53f732bee10256
Apr 20 14:26:57.703385 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.703358 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda649d03f_abf5_42ca_849e_903f5fdc0299.slice/crio-52d7ebe0e901ff206638cdbb57fb3452e67de6ffe1144dcd77d2a597da692f94 WatchSource:0}: Error finding container 52d7ebe0e901ff206638cdbb57fb3452e67de6ffe1144dcd77d2a597da692f94: Status 404 returned error can't find the container with id 52d7ebe0e901ff206638cdbb57fb3452e67de6ffe1144dcd77d2a597da692f94
Apr 20 14:26:57.704875 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.704850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p697p"
Apr 20 14:26:57.710610 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.710593 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vw667"
Apr 20 14:26:57.711806 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.711784 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35778b02_a4c0_4a27_907b_2c96d1273465.slice/crio-93df730b604912e41ec1aadcd438fd57bf08a550037552077f8d4a6036e4e459 WatchSource:0}: Error finding container 93df730b604912e41ec1aadcd438fd57bf08a550037552077f8d4a6036e4e459: Status 404 returned error can't find the container with id 93df730b604912e41ec1aadcd438fd57bf08a550037552077f8d4a6036e4e459
Apr 20 14:26:57.715873 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.715850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f759k"
Apr 20 14:26:57.716458 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.716437 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9676c2f1_f258_4d20_af54_415ffdb94e09.slice/crio-8babb83594b6340158d301cb02fea2f5e129720b940ce2f824a90f5c29af868a WatchSource:0}: Error finding container 8babb83594b6340158d301cb02fea2f5e129720b940ce2f824a90f5c29af868a: Status 404 returned error can't find the container with id 8babb83594b6340158d301cb02fea2f5e129720b940ce2f824a90f5c29af868a
Apr 20 14:26:57.720228 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.720207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9gnvx"
Apr 20 14:26:57.721865 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.721846 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0116b2_344f_40ec_a0a2_b47e8cb06248.slice/crio-5170db1abb100d70dd4f5544c68d946e1757550bed1794fa0b6468093d8ebb20 WatchSource:0}: Error finding container 5170db1abb100d70dd4f5544c68d946e1757550bed1794fa0b6468093d8ebb20: Status 404 returned error can't find the container with id 5170db1abb100d70dd4f5544c68d946e1757550bed1794fa0b6468093d8ebb20
Apr 20 14:26:57.726153 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:26:57.726109 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d6d293_736d_4b7e_86ca_9b3decf1c068.slice/crio-d0912935f82317e7941f881950fcdb1817d4af99ff1dd56cf15e438a7df88427 WatchSource:0}: Error finding container d0912935f82317e7941f881950fcdb1817d4af99ff1dd56cf15e438a7df88427: Status 404 returned error can't find the container with id d0912935f82317e7941f881950fcdb1817d4af99ff1dd56cf15e438a7df88427
Apr 20 14:26:57.953709 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:57.953615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:26:57.953872 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.953780 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:57.953872 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:57.953843 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:26:58.953822996 +0000 UTC m=+3.160815427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:58.054266 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.054210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:26:58.054453 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.054394 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:26:58.054453 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.054420 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:26:58.054453 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.054433 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:58.054624 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.054492 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. No retries permitted until 2026-04-20 14:26:59.054473334 +0000 UTC m=+3.261465756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:58.201277 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.201241 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:26:58.344625 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.344573 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:26:58.376059 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.375995 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:21:57 +0000 UTC" deadline="2028-01-06 03:37:44.861157394 +0000 UTC"
Apr 20 14:26:58.376059 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.376033 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15013h10m46.485128197s"
Apr 20 14:26:58.462593 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.462561 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:26:58.462756 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.462712 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:26:58.484872 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.484364 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:26:58.491320 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.491282 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f759k" event={"ID":"db0116b2-344f-40ec-a0a2-b47e8cb06248","Type":"ContainerStarted","Data":"5170db1abb100d70dd4f5544c68d946e1757550bed1794fa0b6468093d8ebb20"}
Apr 20 14:26:58.503173 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.503095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"52d7ebe0e901ff206638cdbb57fb3452e67de6ffe1144dcd77d2a597da692f94"}
Apr 20 14:26:58.504834 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.504768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-52mwl" event={"ID":"79f008e1-fae4-4db2-bdc8-eb46cf52e662","Type":"ContainerStarted","Data":"8c6a53bc7369a13d8232b2b00681a1f8535605e709b36d44ff53f732bee10256"}
Apr 20 14:26:58.508876 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.508807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w77dh" event={"ID":"26818813-da84-407b-b55f-77d9ffcbb474","Type":"ContainerStarted","Data":"dd7172d44615505c940cf0a971f132609db6c2f46022779da33ed4bcee867460"}
Apr 20 14:26:58.515105 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.515059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" event={"ID":"e1834aa4-27df-4840-ae0e-2df474de8c48","Type":"ContainerStarted","Data":"e6fd1bf2e7f9de39eff219141f08958b214e1f7badff0badf49a39acb39b21b7"}
Apr 20 14:26:58.526024 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.525983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9gnvx" event={"ID":"64d6d293-736d-4b7e-86ca-9b3decf1c068","Type":"ContainerStarted","Data":"d0912935f82317e7941f881950fcdb1817d4af99ff1dd56cf15e438a7df88427"}
Apr 20 14:26:58.535913 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.535875 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vw667" event={"ID":"9676c2f1-f258-4d20-af54-415ffdb94e09","Type":"ContainerStarted","Data":"8babb83594b6340158d301cb02fea2f5e129720b940ce2f824a90f5c29af868a"}
Apr 20 14:26:58.552983 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.552940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p697p" event={"ID":"35778b02-a4c0-4a27-907b-2c96d1273465","Type":"ContainerStarted","Data":"93df730b604912e41ec1aadcd438fd57bf08a550037552077f8d4a6036e4e459"}
Apr 20 14:26:58.565047 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.563180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerStarted","Data":"c0167c635217ec58c8398691fdf119ef7f50a9bbac47430910cedf76ddb26cd1"}
Apr 20 14:26:58.963555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:58.963511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:26:58.963762 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.963691 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:58.963762 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:58.963759 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:27:00.963740519 +0000 UTC m=+5.170732943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:59.063984 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:59.063945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:26:59.064197 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:59.064142 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:26:59.064197 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:59.064171 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:26:59.064197 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:59.064184 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:59.064362 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:59.064250 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:01.064229721 +0000 UTC m=+5.271222160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:59.377322 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:59.377220 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:21:57 +0000 UTC" deadline="2027-12-13 07:57:33.122292335 +0000 UTC"
Apr 20 14:26:59.377322 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:59.377266 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14441h30m33.745032055s"
Apr 20 14:26:59.463069 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:26:59.462475 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:26:59.463069 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:26:59.462628 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:00.463206 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.463101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:00.463689 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.463251 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:00.569064 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.568198 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vkzx4"]
Apr 20 14:27:00.571436 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.571412 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.571563 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.571499 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:00.677514 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.677116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-kubelet-config\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.677514 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.677343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-dbus\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.677514 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.677397 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.778531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-kubelet-config\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.778619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-dbus\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.778655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.778776 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.778838 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:01.278819503 +0000 UTC m=+5.485811937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.779261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-kubelet-config\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.779422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.779337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fc6b1ee-788e-4792-8454-48a99a628442-dbus\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:00.980979 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:00.980936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:00.981191 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.981031 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:27:00.981191 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:00.981103 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec
nodeName:}" failed. No retries permitted until 2026-04-20 14:27:04.981080797 +0000 UTC m=+9.188073237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:01.082655 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:01.081932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:01.082655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.082139 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:01.082655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.082163 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:01.082655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.082178 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:01.082655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.082239 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs 
podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:05.082220121 +0000 UTC m=+9.289212557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:01.283507 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:01.283360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:01.283507 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.283512 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:01.283730 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.283585 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:02.283565434 +0000 UTC m=+6.490557857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:01.463302 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:01.463263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:01.463793 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:01.463422 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:02.292096 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:02.292054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:02.292302 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:02.292238 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:02.292369 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:02.292327 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:27:04.292286205 +0000 UTC m=+8.499278638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:02.466722 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:02.466689 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:02.467240 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:02.466822 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:02.467240 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:02.466981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:02.467240 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:02.467104 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:03.462937 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:03.462896 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:03.463109 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:03.463039 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:04.308939 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:04.308870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:04.309428 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:04.309028 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:04.309428 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:04.309103 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:08.30908353 +0000 UTC m=+12.516075963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:04.462652 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:04.462581 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:04.462652 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:04.462607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:04.462894 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:04.462720 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:04.462894 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:04.462841 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:05.015258 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:05.015214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:05.015459 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.015364 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:05.015459 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.015435 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:27:13.015415573 +0000 UTC m=+17.222407991 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:05.116632 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:05.116595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:05.116820 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.116781 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:05.116820 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.116811 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:05.116933 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.116826 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:05.116933 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.116892 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. 
No retries permitted until 2026-04-20 14:27:13.116871235 +0000 UTC m=+17.323863657 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:05.462922 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:05.462881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:05.463443 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:05.463021 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:06.465702 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:06.465660 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:06.466151 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:06.465656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:06.466151 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:06.465785 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:06.466151 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:06.465877 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:07.462683 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:07.462646 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:07.462882 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:07.462762 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:08.342445 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:08.342400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:08.342898 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:08.342576 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:08.342898 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:08.342661 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:16.342640056 +0000 UTC m=+20.549632474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:08.463047 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:08.463017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:08.463243 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:08.463024 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:08.463243 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:08.463158 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:08.463243 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:08.463227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:09.462470 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:09.462431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:09.462885 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:09.462573 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:10.463194 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:10.463154 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:10.463632 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:10.463169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:10.463632 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:10.463282 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:10.463632 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:10.463330 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:11.462957 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:11.462923 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:11.463174 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:11.463046 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:12.462599 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:12.462561 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:12.463034 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:12.462612 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:12.463034 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:12.462703 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:12.463034 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:12.462843 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:13.074892 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:13.074854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:13.075106 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.075012 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:13.075106 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.075084 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:27:29.075067446 +0000 UTC m=+33.282059863 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:27:13.175600 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:13.175552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:13.175780 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.175745 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:27:13.175780 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.175773 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:27:13.175901 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.175788 2572 projected.go:194] Error preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:27:13.175901 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.175863 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:29.175841871 +0000 UTC m=+33.382834303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:27:13.462965 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:13.462925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:13.463432 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:13.463051 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:14.463045 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:14.463010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:14.463484 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:14.463052 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:14.463484 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:14.463137 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:14.463484 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:14.463228 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:15.462626 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:15.462591 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:15.462809 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:15.462703 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:16.397837 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.397410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:16.398735 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:16.397548 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:16.398735 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:16.397958 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret podName:5fc6b1ee-788e-4792-8454-48a99a628442 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:32.397936233 +0000 UTC m=+36.604928668 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret") pod "global-pull-secret-syncer-vkzx4" (UID: "5fc6b1ee-788e-4792-8454-48a99a628442") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:16.463478 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.463445 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:16.463629 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.463530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:16.463698 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:16.463643 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:16.463787 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:16.463766 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:16.605883 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.605782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vw667" event={"ID":"9676c2f1-f258-4d20-af54-415ffdb94e09","Type":"ContainerStarted","Data":"433aa56239f3c6b741fb7a9bfc99814ddd7b139f378d90160921f09a9ccc503c"}
Apr 20 14:27:16.610291 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.610266 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:27:16.611352 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611202 2572 generic.go:358] "Generic (PLEG): container finished" podID="a649d03f-abf5-42ca-849e-903f5fdc0299" containerID="828fd06b2526486bc9cfbaa93a2cbd821825b520eaa949da24ba6a400794e17b" exitCode=1
Apr 20 14:27:16.611352 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"74c67084187e82462aad93f999ff03f457adc08679ee6dc58a0298351b20c9df"}
Apr 20 14:27:16.611352 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"b22d293aa7cbd01a77d8562beddcaebdd3cd8b5d750c55e7ed1959a33e20191a"}
Apr 20 14:27:16.611352 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611346 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"9cd7e848267fcfe32171be93649ecc06b3885b5c96814009d3c93238e684d8f8"}
Apr 20 14:27:16.611352 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"948448f89c5109f05d90c263e2bc8f6f5eac45382e5246cfa9e11f1d5a64bc97"}
Apr 20 14:27:16.612488 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerDied","Data":"828fd06b2526486bc9cfbaa93a2cbd821825b520eaa949da24ba6a400794e17b"}
Apr 20 14:27:16.612488 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.611389 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"cb0b22c5bab0b52fbbbcb3a9ae111ccfaeb01fa1a6d86737538f2534a63eecf7"}
Apr 20 14:27:16.617362 ip-10-0-140-30 kubenswrapper[2572]:
I0420 14:27:16.617322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" event={"ID":"c8ce0f8ff8898a2f19fd2c9a4c4f3273","Type":"ContainerStarted","Data":"c9d25e76185a6f6bcbb43dfdaad3b94e1e24c07dd6d146a8a01c54b4feda664f"}
Apr 20 14:27:16.619449 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.619423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9gnvx" event={"ID":"64d6d293-736d-4b7e-86ca-9b3decf1c068","Type":"ContainerStarted","Data":"9d91e2612361bf492e1df2f7261d46e889932e6f35e5e01ac91c2182a56b2652"}
Apr 20 14:27:16.623256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.623206 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vw667" podStartSLOduration=2.5684286419999998 podStartE2EDuration="20.623189573s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.718840924 +0000 UTC m=+1.925833342" lastFinishedPulling="2026-04-20 14:27:15.773601856 +0000 UTC m=+19.980594273" observedRunningTime="2026-04-20 14:27:16.623008891 +0000 UTC m=+20.830001328" watchObservedRunningTime="2026-04-20 14:27:16.623189573 +0000 UTC m=+20.830182013"
Apr 20 14:27:16.637539 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.636966 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-30.ec2.internal" podStartSLOduration=20.636948313 podStartE2EDuration="20.636948313s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:27:16.63622433 +0000 UTC m=+20.843216770" watchObservedRunningTime="2026-04-20 14:27:16.636948313 +0000 UTC m=+20.843940752"
Apr 20 14:27:16.653881 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:16.653843 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9gnvx" podStartSLOduration=2.575944137 podStartE2EDuration="20.653830031s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.727604721 +0000 UTC m=+1.934597138" lastFinishedPulling="2026-04-20 14:27:15.805490615 +0000 UTC m=+20.012483032" observedRunningTime="2026-04-20 14:27:16.653599802 +0000 UTC m=+20.860592242" watchObservedRunningTime="2026-04-20 14:27:16.653830031 +0000 UTC m=+20.860822527"
Apr 20 14:27:17.462870 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.462702 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:17.463303 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:17.462953 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:17.574937 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.574914 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 14:27:17.622319 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.622237 2572 generic.go:358] "Generic (PLEG): container finished" podID="1028cc4a1cd2bb7f24fa2491ce4493a5" containerID="5aff4649eb1339bb7808ea1e8995a67d1b2930a98d2b1d3765fb3fc515abed2b" exitCode=0
Apr 20 14:27:17.622319 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.622301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" event={"ID":"1028cc4a1cd2bb7f24fa2491ce4493a5","Type":"ContainerDied","Data":"5aff4649eb1339bb7808ea1e8995a67d1b2930a98d2b1d3765fb3fc515abed2b"}
Apr 20 14:27:17.623604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.623583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f759k" event={"ID":"db0116b2-344f-40ec-a0a2-b47e8cb06248","Type":"ContainerStarted","Data":"6b7a489a2c336d1e9f9f278d095ed4e42ca28f79b84f3f8eb80b8a4fe05819b9"}
Apr 20 14:27:17.625000 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.624965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-52mwl" event={"ID":"79f008e1-fae4-4db2-bdc8-eb46cf52e662","Type":"ContainerStarted","Data":"b68676998675f336d15f174aa35c6227f4a7dc4cffa42c41f176a6bf717f679a"}
Apr 20 14:27:17.629103 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.629074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w77dh" event={"ID":"26818813-da84-407b-b55f-77d9ffcbb474","Type":"ContainerStarted","Data":"6caa4a7c71b7d9dc8d64d7f560b2f317e2e20d726c606515387eefd44e9815b4"} Apr 20
14:27:17.630669 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.630639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" event={"ID":"e1834aa4-27df-4840-ae0e-2df474de8c48","Type":"ContainerStarted","Data":"ff746a6082a6b6b7fb5aa144093ae18fd4f51c25238cb22f3ed02fd3c04a6878"}
Apr 20 14:27:17.630755 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.630677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" event={"ID":"e1834aa4-27df-4840-ae0e-2df474de8c48","Type":"ContainerStarted","Data":"2280f6e868340703a4d6297bac5949a94acf7355b9812db3fc58fd24052f518b"}
Apr 20 14:27:17.631846 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.631830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p697p" event={"ID":"35778b02-a4c0-4a27-907b-2c96d1273465","Type":"ContainerStarted","Data":"9e7dc24271810a802ecf336dc988a3f70f4ccd40c3dde8129147b588eb38c765"}
Apr 20 14:27:17.633154 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.633115 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="92b256c7cbc6032955eaa485d9dd1239f68c0a091c2cdaa4deef141ab5f196d2" exitCode=0
Apr 20 14:27:17.633247 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.633160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"92b256c7cbc6032955eaa485d9dd1239f68c0a091c2cdaa4deef141ab5f196d2"}
Apr 20 14:27:17.676371 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.676327 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f759k" podStartSLOduration=3.627298754 podStartE2EDuration="21.676310136s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.723610728 +0000 UTC m=+1.930603145" lastFinishedPulling="2026-04-20 14:27:15.772622097 +0000 UTC m=+19.979614527" observedRunningTime="2026-04-20 14:27:17.65574602 +0000 UTC m=+21.862738461" watchObservedRunningTime="2026-04-20 14:27:17.676310136 +0000 UTC m=+21.883302577"
Apr 20 14:27:17.689735 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.689678 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-52mwl" podStartSLOduration=3.615091156 podStartE2EDuration="21.689661983s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.699318999 +0000 UTC m=+1.906311415" lastFinishedPulling="2026-04-20 14:27:15.773889808 +0000 UTC m=+19.980882242" observedRunningTime="2026-04-20 14:27:17.689276925 +0000 UTC m=+21.896269364" watchObservedRunningTime="2026-04-20 14:27:17.689661983 +0000 UTC m=+21.896654424"
Apr 20 14:27:17.703090 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.703048 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p697p" podStartSLOduration=3.643115655 podStartE2EDuration="21.703035022s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.71344262 +0000 UTC m=+1.920435037" lastFinishedPulling="2026-04-20 14:27:15.773361986 +0000 UTC m=+19.980354404" observedRunningTime="2026-04-20 14:27:17.702547877 +0000 UTC m=+21.909540328" watchObservedRunningTime="2026-04-20 14:27:17.703035022 +0000 UTC m=+21.910027461"
Apr 20 14:27:17.717757 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:17.717715 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w77dh" podStartSLOduration=3.639186301 podStartE2EDuration="21.717700725s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.693750096 +0000 UTC m=+1.900742531" lastFinishedPulling="2026-04-20 14:27:15.772264532 +0000 UTC m=+19.979256955" observedRunningTime="2026-04-20 14:27:17.717430652 +0000 UTC m=+21.924423090" watchObservedRunningTime="2026-04-20 14:27:17.717700725 +0000 UTC m=+21.924693164"
Apr 20 14:27:18.406989 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.406894 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:27:17.574931577Z","UUID":"e2ae56f0-1025-47e9-95ff-ae5aab3d0eac","Handler":null,"Name":"","Endpoint":""}
Apr 20 14:27:18.408361 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.408340 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 14:27:18.408464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.408372 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 14:27:18.462417 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.462385 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:18.462573 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.462425 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:18.462573 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:18.462486 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:18.462649 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:18.462621 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:18.637946 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.637651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" event={"ID":"1028cc4a1cd2bb7f24fa2491ce4493a5","Type":"ContainerStarted","Data":"eed743957133f2c34017463281835b30323ead131f1f46c113c9f88f9d88d6b4"}
Apr 20 14:27:18.641059 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.641018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:27:18.641441 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.641371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"4bdbc6539fdd80be561317a5f8b38dd548b9be97627e1bd305b3251536a5d1ac"}
Apr 20 14:27:18.643393 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.643319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" event={"ID":"e1834aa4-27df-4840-ae0e-2df474de8c48","Type":"ContainerStarted","Data":"b40e8a5fa64ecf9a3ae60d203d10026ac447a26d187b015da85cb37e9191c0bf"}
Apr 20 14:27:18.652808 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.652758 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-30.ec2.internal" podStartSLOduration=22.652740782 podStartE2EDuration="22.652740782s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:27:18.652280276 +0000 UTC m=+22.859272716" watchObservedRunningTime="2026-04-20 14:27:18.652740782 +0000 UTC m=+22.859733224"
Apr 20 14:27:18.668272 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:18.668222 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jnnqb" podStartSLOduration=2.024265679 podStartE2EDuration="22.668204845s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.666334924 +0000 UTC m=+1.873327341" lastFinishedPulling="2026-04-20 14:27:18.310274091 +0000 UTC m=+22.517266507" observedRunningTime="2026-04-20 14:27:18.667383574 +0000 UTC m=+22.874376037" watchObservedRunningTime="2026-04-20 14:27:18.668204845 +0000 UTC m=+22.875197287"
Apr 20 14:27:19.462785 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:19.462749 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:19.463012 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:19.462889 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:20.462412 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:20.462378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:20.462778 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:20.462483 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:20.462778 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:20.462548 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:20.462778 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:20.462634 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:21.462597 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:21.462543 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:21.463034 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:21.462702 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:22.295038 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.294955 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p697p"
Apr 20 14:27:22.295575 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.295554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p697p"
Apr 20 14:27:22.463056 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.462797 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:22.463491 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.462806 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:22.463491 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:22.463191 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:22.463491 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:22.463231 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:22.651765 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.651732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerStarted","Data":"9ede7a06bd66f8768ecf8309af8baa01694b8e6a9b37c5e4430f84d21d6637a1"}
Apr 20 14:27:22.654555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.654536 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:27:22.654890 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.654867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"0129e2fd48416175c9e6af8224f532801ed1f574fd1c7764cb1c740ebcc1006a"}
Apr 20 14:27:22.655206 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.655121 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p697p"
Apr 20 14:27:22.655206 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.655174 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:27:22.655361 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.655319 2572 scope.go:117] "RemoveContainer" containerID="828fd06b2526486bc9cfbaa93a2cbd821825b520eaa949da24ba6a400794e17b"
Apr 20 14:27:22.655555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.655532 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p697p"
Apr 20 14:27:22.674813 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:22.674772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:27:23.462577 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.462542 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:23.462752 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:23.462659 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:23.662571 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.662539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:27:23.663144 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.663036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" event={"ID":"a649d03f-abf5-42ca-849e-903f5fdc0299","Type":"ContainerStarted","Data":"9b974b530e48a22575f5a16295c3299750807e35f2e5bd058e76d9da36f2a36a"}
Apr 20 14:27:23.663305 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.663289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:27:23.663445 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.663427 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:27:23.664851 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.664830 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="9ede7a06bd66f8768ecf8309af8baa01694b8e6a9b37c5e4430f84d21d6637a1" exitCode=0
Apr 20 14:27:23.664928 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.664890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"9ede7a06bd66f8768ecf8309af8baa01694b8e6a9b37c5e4430f84d21d6637a1"}
Apr 20 14:27:23.678201 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.678175 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:27:23.691915 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:23.691870 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq" podStartSLOduration=9.420283635 podStartE2EDuration="27.691856341s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.705854306 +0000 UTC m=+1.912846738" lastFinishedPulling="2026-04-20 14:27:15.977427024 +0000 UTC m=+20.184419444" observedRunningTime="2026-04-20 14:27:23.690241797 +0000 UTC m=+27.897234236" watchObservedRunningTime="2026-04-20 14:27:23.691856341 +0000 UTC m=+27.898848780"
Apr 20 14:27:24.030905 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.030868 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vkzx4"]
Apr 20 14:27:24.031088 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.031004 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:24.031184 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:24.031093 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442"
Apr 20 14:27:24.033641 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.033606 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dgqlp"]
Apr 20 14:27:24.034157 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.034061 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp"
Apr 20 14:27:24.035335 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.035301 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7tjl"]
Apr 20 14:27:24.035448 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.035431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:24.035584 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:24.035561 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:24.037352 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:24.037321 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a"
Apr 20 14:27:24.669137 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.669021 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="22591e4f60ad19ab6eb7b62a84d77d4fee7f77f2bbac5c95128bf486e0aac82e" exitCode=0
Apr 20 14:27:24.669703 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:24.669111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"22591e4f60ad19ab6eb7b62a84d77d4fee7f77f2bbac5c95128bf486e0aac82e"}
Apr 20 14:27:25.462301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:25.462100 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl"
Apr 20 14:27:25.462424 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:25.462104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4"
Apr 20 14:27:25.462424 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:25.462402 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec"
Apr 20 14:27:25.462493 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:25.462450 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:25.672675 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:25.672588 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="49452957682da89af7120027a5aceb1ed0e3b74ca0954c0f768cb423d9fad16b" exitCode=0 Apr 20 14:27:25.673012 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:25.672677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"49452957682da89af7120027a5aceb1ed0e3b74ca0954c0f768cb423d9fad16b"} Apr 20 14:27:26.464332 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:26.464293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:26.464510 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:26.464416 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:27.462196 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:27.462164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:27.462648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:27.462172 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:27.462648 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:27.462265 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vkzx4" podUID="5fc6b1ee-788e-4792-8454-48a99a628442" Apr 20 14:27:27.462648 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:27.462396 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7tjl" podUID="19b3975f-609f-427a-a428-db9cb8176eec" Apr 20 14:27:28.463049 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:28.463010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:28.463486 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:28.463176 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dgqlp" podUID="e64bf075-31c6-4e04-b3fc-8dddfabaff6a" Apr 20 14:27:29.086262 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.086233 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-30.ec2.internal" event="NodeReady" Apr 20 14:27:29.086424 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.086399 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 14:27:29.088641 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.088614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:29.088760 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.088745 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:29.088819 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.088808 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:28:01.088788603 +0000 UTC m=+65.295781024 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:29.131592 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.131551 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6ln9g"] Apr 20 14:27:29.162032 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.162002 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b2q9g"] Apr 20 14:27:29.162221 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.162152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.164820 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.164792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\"" Apr 20 14:27:29.164956 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.164793 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 14:27:29.164956 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.164792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 14:27:29.174955 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.174927 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6ln9g"] Apr 20 14:27:29.174955 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.174958 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b2q9g"] Apr 20 14:27:29.175155 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.175063 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.177638 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.177616 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 14:27:29.177759 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.177639 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 14:27:29.177759 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.177666 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 14:27:29.177759 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.177697 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\"" Apr 20 14:27:29.189932 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.189905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:29.190100 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.190077 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:29.190227 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.190104 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:29.190227 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.190117 2572 projected.go:194] Error 
preparing data for projected volume kube-api-access-nhnxs for pod openshift-network-diagnostics/network-check-target-dgqlp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:29.190227 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.190194 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs podName:e64bf075-31c6-4e04-b3fc-8dddfabaff6a nodeName:}" failed. No retries permitted until 2026-04-20 14:28:01.190174634 +0000 UTC m=+65.397167055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhnxs" (UniqueName: "kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs") pod "network-check-target-dgqlp" (UID: "e64bf075-31c6-4e04-b3fc-8dddfabaff6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:29.291025 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.290987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.291235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.291077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-config-volume\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.291235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.291109 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-tmp-dir\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.291235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.291168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjkpp\" (UniqueName: \"kubernetes.io/projected/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-kube-api-access-rjkpp\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.291235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.291199 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pl5\" (UniqueName: \"kubernetes.io/projected/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-kube-api-access-54pl5\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.291235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.291222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.391705 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.391603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-config-volume\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.391705 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:27:29.391657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-tmp-dir\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.391705 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.391694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjkpp\" (UniqueName: \"kubernetes.io/projected/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-kube-api-access-rjkpp\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.391722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54pl5\" (UniqueName: \"kubernetes.io/projected/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-kube-api-access-54pl5\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.391742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.391795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: E0420 
14:27:29.391893 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.391969 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:29.891947785 +0000 UTC m=+34.098940205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found Apr 20 14:27:29.391985 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.391894 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:29.392249 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.392004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:29.891995182 +0000 UTC m=+34.098987598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found Apr 20 14:27:29.392249 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.392102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-tmp-dir\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.392249 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.392224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-config-volume\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.406189 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.406165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pl5\" (UniqueName: \"kubernetes.io/projected/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-kube-api-access-54pl5\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.406320 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.406286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjkpp\" (UniqueName: \"kubernetes.io/projected/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-kube-api-access-rjkpp\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.462230 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.462191 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:29.462230 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.462236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:27:29.465262 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.465226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\"" Apr 20 14:27:29.465262 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.465259 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:27:29.465668 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.465304 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:27:29.895927 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.895883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:29.896232 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:29.895952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:29.896232 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.896002 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:29.896232 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.896071 2572 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:29.896232 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.896080 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:30.896062561 +0000 UTC m=+35.103054999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found Apr 20 14:27:29.896232 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:29.896119 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:30.89610687 +0000 UTC m=+35.103099304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found Apr 20 14:27:30.462574 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.462541 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:27:30.466401 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.465861 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:27:30.466804 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.466762 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\"" Apr 20 14:27:30.467071 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.467040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:27:30.904400 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.904322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:30.904542 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:30.904413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:30.904542 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:30.904452 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:30.904542 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:30.904533 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:27:32.904511145 +0000 UTC m=+37.111503565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found Apr 20 14:27:30.904655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:30.904566 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:30.904655 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:30.904619 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:32.90460672 +0000 UTC m=+37.111599154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found Apr 20 14:27:32.415393 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.415100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:32.420416 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.420394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fc6b1ee-788e-4792-8454-48a99a628442-original-pull-secret\") pod \"global-pull-secret-syncer-vkzx4\" (UID: \"5fc6b1ee-788e-4792-8454-48a99a628442\") " 
pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:32.473431 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.473396 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkzx4" Apr 20 14:27:32.626646 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.626616 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vkzx4"] Apr 20 14:27:32.630114 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:27:32.630092 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc6b1ee_788e_4792_8454_48a99a628442.slice/crio-25376bb4d9f90983ab3b816abc73746a30e14a4db194d63bc1eba419d66737c0 WatchSource:0}: Error finding container 25376bb4d9f90983ab3b816abc73746a30e14a4db194d63bc1eba419d66737c0: Status 404 returned error can't find the container with id 25376bb4d9f90983ab3b816abc73746a30e14a4db194d63bc1eba419d66737c0 Apr 20 14:27:32.689579 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.689502 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="4152af1930586df237e69450721d99e1d06ccdb87cc93a888e6855ac498049ae" exitCode=0 Apr 20 14:27:32.689730 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.689580 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"4152af1930586df237e69450721d99e1d06ccdb87cc93a888e6855ac498049ae"} Apr 20 14:27:32.690585 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.690560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vkzx4" event={"ID":"5fc6b1ee-788e-4792-8454-48a99a628442","Type":"ContainerStarted","Data":"25376bb4d9f90983ab3b816abc73746a30e14a4db194d63bc1eba419d66737c0"} Apr 20 14:27:32.918826 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:27:32.918628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:27:32.919011 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:32.918852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:27:32.919011 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:32.918767 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:32.919011 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:32.918950 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:36.918929923 +0000 UTC m=+41.125922355 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found
Apr 20 14:27:32.919011 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:32.918962 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:32.919011 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:32.919005 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:36.918991316 +0000 UTC m=+41.125983750 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:33.695899 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:33.695852 2572 generic.go:358] "Generic (PLEG): container finished" podID="9798cc7a-2b24-4609-adaf-fd0cb6fa296b" containerID="f446e8589fbf38111f9c6de3d30c9da8e874e1634b8a44ac33b5565332282ddc" exitCode=0
Apr 20 14:27:33.696730 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:33.695938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerDied","Data":"f446e8589fbf38111f9c6de3d30c9da8e874e1634b8a44ac33b5565332282ddc"}
Apr 20 14:27:34.702014 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:34.701966 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgc2"
event={"ID":"9798cc7a-2b24-4609-adaf-fd0cb6fa296b","Type":"ContainerStarted","Data":"f157dfba88e05efbe6c26dfc224f749e1f19da39bc1573e55b987521721d9066"}
Apr 20 14:27:34.725341 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:34.725275 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9kgc2" podStartSLOduration=4.750607117 podStartE2EDuration="38.725258943s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:26:57.655418127 +0000 UTC m=+1.862410545" lastFinishedPulling="2026-04-20 14:27:31.63006995 +0000 UTC m=+35.837062371" observedRunningTime="2026-04-20 14:27:34.724102482 +0000 UTC m=+38.931094921" watchObservedRunningTime="2026-04-20 14:27:34.725258943 +0000 UTC m=+38.932251442"
Apr 20 14:27:36.707738 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:36.707703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vkzx4" event={"ID":"5fc6b1ee-788e-4792-8454-48a99a628442","Type":"ContainerStarted","Data":"9681b95ed1ada324a1bef95e7f4676dad656ec610ef25c690418338253e80755"}
Apr 20 14:27:36.730899 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:36.730772 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vkzx4" podStartSLOduration=32.81175553 podStartE2EDuration="36.730752542s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:32.631475601 +0000 UTC m=+36.838468017" lastFinishedPulling="2026-04-20 14:27:36.550472612 +0000 UTC m=+40.757465029" observedRunningTime="2026-04-20 14:27:36.730486002 +0000 UTC m=+40.937478442" watchObservedRunningTime="2026-04-20 14:27:36.730752542 +0000 UTC m=+40.937744998"
Apr 20 14:27:36.951940 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:36.951836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g"
Apr 20 14:27:36.951940 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:36.951894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g"
Apr 20 14:27:36.952179 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:36.952000 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:36.952179 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:36.952003 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:36.952179 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:36.952055 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:27:44.952038899 +0000 UTC m=+49.159031317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:36.952179 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:36.952068 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed.
No retries permitted until 2026-04-20 14:27:44.952062386 +0000 UTC m=+49.159054803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found
Apr 20 14:27:42.096735 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.096702 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"]
Apr 20 14:27:42.139580 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.139538 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"]
Apr 20 14:27:42.139776 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.139727 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.142547 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.142519 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 14:27:42.142672 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.142557 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 14:27:42.142672 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.142567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 20 14:27:42.142672 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.142622 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 14:27:42.162096
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.162070 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"]
Apr 20 14:27:42.162096 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.162100 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"]
Apr 20 14:27:42.162265 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.162154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.164808 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.164781 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 20 14:27:42.164925 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.164782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 20 14:27:42.164925 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.164782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 20 14:27:42.164925 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.164783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 20 14:27:42.290474 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID:
\"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.290648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5163523-56d6-48e0-bd8e-38ebd1275699-tmp\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.290648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.290648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c5163523-56d6-48e0-bd8e-38ebd1275699-klusterlet-config\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.290648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4xc\" (UniqueName: \"kubernetes.io/projected/c5163523-56d6-48e0-bd8e-38ebd1275699-kube-api-access-fk4xc\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") "
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.290648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290638 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.290807 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.290807 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp7c4\" (UniqueName: \"kubernetes.io/projected/5463f4d9-adfe-4439-bff5-2f1e0e27c141-kube-api-access-qp7c4\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.290807 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.290781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.391885 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c5163523-56d6-48e0-bd8e-38ebd1275699-klusterlet-config\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.391885 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4xc\" (UniqueName: \"kubernetes.io/projected/c5163523-56d6-48e0-bd8e-38ebd1275699-kube-api-access-fk4xc\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.391885 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.391885 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.392236 ip-10-0-140-30
kubenswrapper[2572]: I0420 14:27:42.391906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp7c4\" (UniqueName: \"kubernetes.io/projected/5463f4d9-adfe-4439-bff5-2f1e0e27c141-kube-api-access-qp7c4\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.392236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.392236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.392236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.391997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5163523-56d6-48e0-bd8e-38ebd1275699-tmp\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.392236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.392020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName:
\"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.392609 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.392574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5163523-56d6-48e0-bd8e-38ebd1275699-tmp\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.392820 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.392778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.395890 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.395858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.396036 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.396013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.396865 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.396250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c5163523-56d6-48e0-bd8e-38ebd1275699-klusterlet-config\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.401482 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.401457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4xc\" (UniqueName: \"kubernetes.io/projected/c5163523-56d6-48e0-bd8e-38ebd1275699-kube-api-access-fk4xc\") pod \"klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8\" (UID: \"c5163523-56d6-48e0-bd8e-38ebd1275699\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.406394 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.406369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-ca\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.406394 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.406383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5463f4d9-adfe-4439-bff5-2f1e0e27c141-hub\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.408356 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.408336 2572 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"kube-api-access-qp7c4\" (UniqueName: \"kubernetes.io/projected/5463f4d9-adfe-4439-bff5-2f1e0e27c141-kube-api-access-qp7c4\") pod \"cluster-proxy-proxy-agent-58cfc5444d-bz2qk\" (UID: \"5463f4d9-adfe-4439-bff5-2f1e0e27c141\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.449371 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.449324 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:42.481583 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.481553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"
Apr 20 14:27:42.595865 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.595799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"]
Apr 20 14:27:42.599794 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:27:42.599757 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5163523_56d6_48e0_bd8e_38ebd1275699.slice/crio-c1b5b188b0329fac93f9b313a6c61f9d9c95b5cbdc3461792ce74686f2470195 WatchSource:0}: Error finding container c1b5b188b0329fac93f9b313a6c61f9d9c95b5cbdc3461792ce74686f2470195: Status 404 returned error can't find the container with id c1b5b188b0329fac93f9b313a6c61f9d9c95b5cbdc3461792ce74686f2470195
Apr 20 14:27:42.649740 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.649663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk"]
Apr 20 14:27:42.652783 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:27:42.652759 2572 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5463f4d9_adfe_4439_bff5_2f1e0e27c141.slice/crio-4ae1ecdeda7a21e25601fc86c2ce8269af7a3d42da004711b5b0b9afe545051a WatchSource:0}: Error finding container 4ae1ecdeda7a21e25601fc86c2ce8269af7a3d42da004711b5b0b9afe545051a: Status 404 returned error can't find the container with id 4ae1ecdeda7a21e25601fc86c2ce8269af7a3d42da004711b5b0b9afe545051a
Apr 20 14:27:42.719885 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.719840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerStarted","Data":"4ae1ecdeda7a21e25601fc86c2ce8269af7a3d42da004711b5b0b9afe545051a"}
Apr 20 14:27:42.720801 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:42.720766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8" event={"ID":"c5163523-56d6-48e0-bd8e-38ebd1275699","Type":"ContainerStarted","Data":"c1b5b188b0329fac93f9b313a6c61f9d9c95b5cbdc3461792ce74686f2470195"}
Apr 20 14:27:45.015760 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:45.015705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g"
Apr 20 14:27:45.016315 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:45.015795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g"
Apr 20 14:27:45.016315 ip-10-0-140-30 kubenswrapper[2572]: E0420
14:27:45.015892 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:45.016315 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:45.015954 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:45.016315 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:45.016019 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:28:01.01599929 +0000 UTC m=+65.222991722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:45.016315 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:27:45.016040 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:01.016031211 +0000 UTC m=+65.223023627 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found
Apr 20 14:27:48.735213 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:48.735173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerStarted","Data":"bf5121f199e0e6b22e1a635ce8f55cb76fc7f06d97c43b02e64546061ef40957"}
Apr 20 14:27:48.736604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:48.736578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8" event={"ID":"c5163523-56d6-48e0-bd8e-38ebd1275699","Type":"ContainerStarted","Data":"15fc788775a460cbb0c7179bd2d78a23237d68ecd3ad7dbc5ea59f789780551d"}
Apr 20 14:27:48.736849 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:48.736822 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:48.738686 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:48.738663 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8"
Apr 20 14:27:48.768101 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:48.768050 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6c5d5ff5bd-vgsc8" podStartSLOduration=1.713768035 podStartE2EDuration="6.768036722s" podCreationTimestamp="2026-04-20 14:27:42 +0000 UTC" firstStartedPulling="2026-04-20 14:27:42.601761413 +0000 UTC m=+46.808753829" lastFinishedPulling="2026-04-20 14:27:47.656030099 +0000 UTC m=+51.863022516"
observedRunningTime="2026-04-20 14:27:48.752313726 +0000 UTC m=+52.959306166" watchObservedRunningTime="2026-04-20 14:27:48.768036722 +0000 UTC m=+52.975029161"
Apr 20 14:27:50.742824 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:50.742785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerStarted","Data":"88919017e9ca04e3ebd3299bc208aeb2f9eb3df7f2e55a323ccd037875cf143d"}
Apr 20 14:27:50.742824 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:50.742827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerStarted","Data":"7ee2495e2d67464a613657743f421b0b0ef4b76280bc554d49c1bb981a54c25b"}
Apr 20 14:27:50.761329 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:50.761280 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" podStartSLOduration=1.591508654 podStartE2EDuration="8.761266214s" podCreationTimestamp="2026-04-20 14:27:42 +0000 UTC" firstStartedPulling="2026-04-20 14:27:42.654480227 +0000 UTC m=+46.861472647" lastFinishedPulling="2026-04-20 14:27:49.824237787 +0000 UTC m=+54.031230207" observedRunningTime="2026-04-20 14:27:50.760546614 +0000 UTC m=+54.967539083" watchObservedRunningTime="2026-04-20 14:27:50.761266214 +0000 UTC m=+54.968258653"
Apr 20 14:27:55.683135 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:27:55.683091 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x29sq"
Apr 20 14:28:01.035549 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.035498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g"
Apr 20 14:28:01.036031 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.035572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g"
Apr 20 14:28:01.036031 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.035648 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:28:01.036031 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.035679 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:28:01.036031 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.035721 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:33.035705877 +0000 UTC m=+97.242698312 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found
Apr 20 14:28:01.036031 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.035735 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed.
No retries permitted until 2026-04-20 14:28:33.035728966 +0000 UTC m=+97.242721382 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found Apr 20 14:28:01.136384 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.136342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:28:01.139057 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.139039 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:28:01.146969 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.146946 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:28:01.147067 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:01.147020 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs podName:19b3975f-609f-427a-a428-db9cb8176eec nodeName:}" failed. No retries permitted until 2026-04-20 14:29:05.146999266 +0000 UTC m=+129.353991684 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs") pod "network-metrics-daemon-g7tjl" (UID: "19b3975f-609f-427a-a428-db9cb8176eec") : secret "metrics-daemon-secret" not found Apr 20 14:28:01.237646 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.237605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:28:01.240600 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.240575 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:28:01.250903 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.250881 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:28:01.261220 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.261198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnxs\" (UniqueName: \"kubernetes.io/projected/e64bf075-31c6-4e04-b3fc-8dddfabaff6a-kube-api-access-nhnxs\") pod \"network-check-target-dgqlp\" (UID: \"e64bf075-31c6-4e04-b3fc-8dddfabaff6a\") " pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:28:01.378256 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.378169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\"" Apr 20 14:28:01.386010 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.385991 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:28:01.498224 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.498195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dgqlp"] Apr 20 14:28:01.501663 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:28:01.501627 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64bf075_31c6_4e04_b3fc_8dddfabaff6a.slice/crio-fd0cc71d98b08d3915a491026ee2fff060faaf88940c8bbeb76bc1f8aa82b282 WatchSource:0}: Error finding container fd0cc71d98b08d3915a491026ee2fff060faaf88940c8bbeb76bc1f8aa82b282: Status 404 returned error can't find the container with id fd0cc71d98b08d3915a491026ee2fff060faaf88940c8bbeb76bc1f8aa82b282 Apr 20 14:28:01.766265 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:01.766227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dgqlp" event={"ID":"e64bf075-31c6-4e04-b3fc-8dddfabaff6a","Type":"ContainerStarted","Data":"fd0cc71d98b08d3915a491026ee2fff060faaf88940c8bbeb76bc1f8aa82b282"} Apr 20 14:28:04.774667 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:04.774629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dgqlp" event={"ID":"e64bf075-31c6-4e04-b3fc-8dddfabaff6a","Type":"ContainerStarted","Data":"6b67a03b5a9f7cf4d4d20cd8f73e4f7bfbc66f71582342888c63590072184de7"} Apr 20 14:28:04.775150 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:04.774834 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:28:04.792898 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:04.792813 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dgqlp" 
podStartSLOduration=65.758428843 podStartE2EDuration="1m8.792799165s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:28:01.503329481 +0000 UTC m=+65.710321898" lastFinishedPulling="2026-04-20 14:28:04.537699803 +0000 UTC m=+68.744692220" observedRunningTime="2026-04-20 14:28:04.792005997 +0000 UTC m=+68.998998437" watchObservedRunningTime="2026-04-20 14:28:04.792799165 +0000 UTC m=+68.999791604" Apr 20 14:28:30.924403 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:30.924278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w77dh_26818813-da84-407b-b55f-77d9ffcbb474/dns-node-resolver/0.log" Apr 20 14:28:31.525032 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:31.525002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f759k_db0116b2-344f-40ec-a0a2-b47e8cb06248/node-ca/0.log" Apr 20 14:28:33.070811 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:33.070754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:28:33.070811 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:33.070822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:28:33.071299 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:33.070904 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:28:33.071299 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:33.070985 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert podName:2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:37.070967168 +0000 UTC m=+161.277959590 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert") pod "ingress-canary-b2q9g" (UID: "2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4") : secret "canary-serving-cert" not found Apr 20 14:28:33.071299 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:33.070907 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:28:33.071299 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:28:33.071049 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls podName:ed9eaae0-abf3-4e1a-89f2-760f6e63f14a nodeName:}" failed. No retries permitted until 2026-04-20 14:29:37.07103708 +0000 UTC m=+161.278029497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls") pod "dns-default-6ln9g" (UID: "ed9eaae0-abf3-4e1a-89f2-760f6e63f14a") : secret "dns-default-metrics-tls" not found Apr 20 14:28:35.780599 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:35.780564 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dgqlp" Apr 20 14:28:51.658631 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.658597 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2jkbs"] Apr 20 14:28:51.660739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.660720 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.667261 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.666942 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:28:51.667261 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.666981 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:28:51.667261 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.667016 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:28:51.667427 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.667316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rq5c2\"" Apr 20 14:28:51.667427 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.667326 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:28:51.683732 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.683703 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2jkbs"] Apr 20 14:28:51.806111 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.806076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.806111 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.806113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/78b75644-9f8e-441d-9ce9-45bc435b888f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.806335 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.806216 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/78b75644-9f8e-441d-9ce9-45bc435b888f-data-volume\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.806335 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.806254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4j4q\" (UniqueName: \"kubernetes.io/projected/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-api-access-q4j4q\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.806335 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.806272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/78b75644-9f8e-441d-9ce9-45bc435b888f-crio-socket\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907222 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/78b75644-9f8e-441d-9ce9-45bc435b888f-data-volume\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " 
pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907222 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4j4q\" (UniqueName: \"kubernetes.io/projected/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-api-access-q4j4q\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907449 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/78b75644-9f8e-441d-9ce9-45bc435b888f-crio-socket\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907449 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907449 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/78b75644-9f8e-441d-9ce9-45bc435b888f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907449 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/78b75644-9f8e-441d-9ce9-45bc435b888f-crio-socket\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907609 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/78b75644-9f8e-441d-9ce9-45bc435b888f-data-volume\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.907819 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.907795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.909763 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.909721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/78b75644-9f8e-441d-9ce9-45bc435b888f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.917396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:51.917369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4j4q\" (UniqueName: \"kubernetes.io/projected/78b75644-9f8e-441d-9ce9-45bc435b888f-kube-api-access-q4j4q\") pod \"insights-runtime-extractor-2jkbs\" (UID: \"78b75644-9f8e-441d-9ce9-45bc435b888f\") " pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:51.969378 ip-10-0-140-30 kubenswrapper[2572]: 
I0420 14:28:51.969340 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2jkbs" Apr 20 14:28:52.085189 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:52.085146 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2jkbs"] Apr 20 14:28:52.088576 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:28:52.088545 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b75644_9f8e_441d_9ce9_45bc435b888f.slice/crio-6697be534ee83335b79871b6d6fcdf11e1a2d6c74ba1fe30b8a7ef42dd8b93cf WatchSource:0}: Error finding container 6697be534ee83335b79871b6d6fcdf11e1a2d6c74ba1fe30b8a7ef42dd8b93cf: Status 404 returned error can't find the container with id 6697be534ee83335b79871b6d6fcdf11e1a2d6c74ba1fe30b8a7ef42dd8b93cf Apr 20 14:28:52.887976 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:52.887938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jkbs" event={"ID":"78b75644-9f8e-441d-9ce9-45bc435b888f","Type":"ContainerStarted","Data":"90262892c03967c1d12a2947810159e47ea7cf28a9b16c1406e429f373613bf3"} Apr 20 14:28:52.887976 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:52.887974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jkbs" event={"ID":"78b75644-9f8e-441d-9ce9-45bc435b888f","Type":"ContainerStarted","Data":"20645dccf0090de0e55b3a10680b6365725b02349b0d768464b6f88935544af7"} Apr 20 14:28:52.887976 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:52.887984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jkbs" event={"ID":"78b75644-9f8e-441d-9ce9-45bc435b888f","Type":"ContainerStarted","Data":"6697be534ee83335b79871b6d6fcdf11e1a2d6c74ba1fe30b8a7ef42dd8b93cf"} Apr 20 14:28:54.894949 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:28:54.894859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jkbs" event={"ID":"78b75644-9f8e-441d-9ce9-45bc435b888f","Type":"ContainerStarted","Data":"46121a3fa8543cdc7635399cca4c3ed1ff499bfba9ff73ef5c3ceb608ea38fbd"} Apr 20 14:28:54.913890 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:28:54.913845 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2jkbs" podStartSLOduration=1.5636850789999999 podStartE2EDuration="3.913831086s" podCreationTimestamp="2026-04-20 14:28:51 +0000 UTC" firstStartedPulling="2026-04-20 14:28:52.136629073 +0000 UTC m=+116.343621490" lastFinishedPulling="2026-04-20 14:28:54.486775066 +0000 UTC m=+118.693767497" observedRunningTime="2026-04-20 14:28:54.912636465 +0000 UTC m=+119.119628901" watchObservedRunningTime="2026-04-20 14:28:54.913831086 +0000 UTC m=+119.120823525" Apr 20 14:29:04.668395 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.668356 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vp8nl"] Apr 20 14:29:04.670780 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.670758 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.673762 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.673730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 14:29:04.673762 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.673736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 14:29:04.674087 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.674067 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:29:04.674190 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.674101 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:29:04.674508 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.674489 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 14:29:04.674841 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.674825 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vhd9x\"" Apr 20 14:29:04.674922 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.674861 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:29:04.806200 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-wtmp\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " 
pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806200 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-root\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsjt\" (UniqueName: \"kubernetes.io/projected/41c9c8fd-f140-48b8-95f6-02067151be5a-kube-api-access-vxsjt\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-tls\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-sys\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806396 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-metrics-client-ca\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.806577 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.806412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-textfile\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907218 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-tls\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-sys\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-metrics-client-ca\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-textfile\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " 
pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-wtmp\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907718 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-sys\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907718 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-root\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907718 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-wtmp\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907718 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsjt\" (UniqueName: \"kubernetes.io/projected/41c9c8fd-f140-48b8-95f6-02067151be5a-kube-api-access-vxsjt\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " 
pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907718 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41c9c8fd-f140-48b8-95f6-02067151be5a-root\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907941 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-textfile\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907941 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-metrics-client-ca\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.907941 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.907882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.909672 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.909655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-tls\") pod \"node-exporter-vp8nl\" (UID: 
\"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.909780 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.909719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41c9c8fd-f140-48b8-95f6-02067151be5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.916247 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.916222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsjt\" (UniqueName: \"kubernetes.io/projected/41c9c8fd-f140-48b8-95f6-02067151be5a-kube-api-access-vxsjt\") pod \"node-exporter-vp8nl\" (UID: \"41c9c8fd-f140-48b8-95f6-02067151be5a\") " pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.980065 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:04.980036 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vp8nl" Apr 20 14:29:04.987778 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:04.987747 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c9c8fd_f140_48b8_95f6_02067151be5a.slice/crio-6c1a1ce8e9ff35948a7825c3e7978f048899ddf7b1ace86527156af3e665d9e7 WatchSource:0}: Error finding container 6c1a1ce8e9ff35948a7825c3e7978f048899ddf7b1ace86527156af3e665d9e7: Status 404 returned error can't find the container with id 6c1a1ce8e9ff35948a7825c3e7978f048899ddf7b1ace86527156af3e665d9e7 Apr 20 14:29:05.210305 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.210250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:29:05.212559 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.212538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19b3975f-609f-427a-a428-db9cb8176eec-metrics-certs\") pod \"network-metrics-daemon-g7tjl\" (UID: \"19b3975f-609f-427a-a428-db9cb8176eec\") " pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:29:05.481435 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.481401 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\"" Apr 20 14:29:05.488692 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.488659 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7tjl" Apr 20 14:29:05.622566 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.622535 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7tjl"] Apr 20 14:29:05.698510 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:05.698470 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b3975f_609f_427a_a428_db9cb8176eec.slice/crio-b177fc860b33dba0a387190d72a8d3ceca1a2c73999fa576f247d1e001934056 WatchSource:0}: Error finding container b177fc860b33dba0a387190d72a8d3ceca1a2c73999fa576f247d1e001934056: Status 404 returned error can't find the container with id b177fc860b33dba0a387190d72a8d3ceca1a2c73999fa576f247d1e001934056 Apr 20 14:29:05.923303 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.923262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vp8nl" event={"ID":"41c9c8fd-f140-48b8-95f6-02067151be5a","Type":"ContainerStarted","Data":"cb6fbf74d59f7b93469448d7d93b72cc0f188b189610cfd00a2e6ef6ecd8a28c"} Apr 20 14:29:05.923464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.923308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vp8nl" event={"ID":"41c9c8fd-f140-48b8-95f6-02067151be5a","Type":"ContainerStarted","Data":"6c1a1ce8e9ff35948a7825c3e7978f048899ddf7b1ace86527156af3e665d9e7"} Apr 20 14:29:05.924410 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:05.924380 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7tjl" event={"ID":"19b3975f-609f-427a-a428-db9cb8176eec","Type":"ContainerStarted","Data":"b177fc860b33dba0a387190d72a8d3ceca1a2c73999fa576f247d1e001934056"} Apr 20 14:29:06.928756 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:06.928666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-g7tjl" event={"ID":"19b3975f-609f-427a-a428-db9cb8176eec","Type":"ContainerStarted","Data":"612b8d2fe030972256a9b17711c25ad19d0ebd167d6ca9698cbdd553c3fc6ebd"} Apr 20 14:29:06.928756 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:06.928706 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7tjl" event={"ID":"19b3975f-609f-427a-a428-db9cb8176eec","Type":"ContainerStarted","Data":"a5647ca440fcb13a345d0f2e13d171e64ea51fb80be435cdf8ad507922924edd"} Apr 20 14:29:06.929862 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:06.929839 2572 generic.go:358] "Generic (PLEG): container finished" podID="41c9c8fd-f140-48b8-95f6-02067151be5a" containerID="cb6fbf74d59f7b93469448d7d93b72cc0f188b189610cfd00a2e6ef6ecd8a28c" exitCode=0 Apr 20 14:29:06.929963 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:06.929873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vp8nl" event={"ID":"41c9c8fd-f140-48b8-95f6-02067151be5a","Type":"ContainerDied","Data":"cb6fbf74d59f7b93469448d7d93b72cc0f188b189610cfd00a2e6ef6ecd8a28c"} Apr 20 14:29:06.946554 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:06.946135 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g7tjl" podStartSLOduration=130.066912713 podStartE2EDuration="2m10.946104197s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:29:05.700292551 +0000 UTC m=+129.907284968" lastFinishedPulling="2026-04-20 14:29:06.579484036 +0000 UTC m=+130.786476452" observedRunningTime="2026-04-20 14:29:06.945918746 +0000 UTC m=+131.152911191" watchObservedRunningTime="2026-04-20 14:29:06.946104197 +0000 UTC m=+131.153096637" Apr 20 14:29:07.899105 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.899073 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-8445558856-t2vgl"] Apr 
20 14:29:07.902266 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.902237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:07.909889 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.909857 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 14:29:07.910019 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.909926 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 14:29:07.910236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.910211 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8p39aasa25dv1\"" Apr 20 14:29:07.910236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.910220 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rfgnk\"" Apr 20 14:29:07.910759 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.910740 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 14:29:07.910865 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.910790 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 14:29:07.912698 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.912680 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 14:29:07.925174 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.925150 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8445558856-t2vgl"] Apr 20 14:29:07.934864 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:29:07.934833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vp8nl" event={"ID":"41c9c8fd-f140-48b8-95f6-02067151be5a","Type":"ContainerStarted","Data":"8bb30452c53e65f922c1ba50a8bcdadf2dbb9564cda84f11081b0887c162c01e"} Apr 20 14:29:07.935296 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.934872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vp8nl" event={"ID":"41c9c8fd-f140-48b8-95f6-02067151be5a","Type":"ContainerStarted","Data":"46c579d2226735f1ebe2bbbb751994faaaa112ad554971c929272501d4b320f0"} Apr 20 14:29:07.992956 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:07.992905 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vp8nl" podStartSLOduration=3.251053429 podStartE2EDuration="3.992886541s" podCreationTimestamp="2026-04-20 14:29:04 +0000 UTC" firstStartedPulling="2026-04-20 14:29:04.989332119 +0000 UTC m=+129.196324537" lastFinishedPulling="2026-04-20 14:29:05.731165229 +0000 UTC m=+129.938157649" observedRunningTime="2026-04-20 14:29:07.991741022 +0000 UTC m=+132.198733461" watchObservedRunningTime="2026-04-20 14:29:07.992886541 +0000 UTC m=+132.199878962" Apr 20 14:29:08.034838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.034800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45644160-70a2-4887-b258-6c0a50fd530c-metrics-client-ca\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035002 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.034862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035002 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.034942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035075 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.035054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67nxn\" (UniqueName: \"kubernetes.io/projected/45644160-70a2-4887-b258-6c0a50fd530c-kube-api-access-67nxn\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035220 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.035201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035282 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.035242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035331 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.035280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.035376 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.035339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-grpc-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.135878 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.135843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.135878 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.135889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-web\") 
pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136203 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67nxn\" (UniqueName: \"kubernetes.io/projected/45644160-70a2-4887-b258-6c0a50fd530c-kube-api-access-67nxn\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136358 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136358 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136358 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136358 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-grpc-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.136569 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.136430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45644160-70a2-4887-b258-6c0a50fd530c-metrics-client-ca\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.137181 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.137160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45644160-70a2-4887-b258-6c0a50fd530c-metrics-client-ca\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139057 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139029 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139163 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139163 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139268 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139268 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-thanos-querier-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.139637 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.139590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45644160-70a2-4887-b258-6c0a50fd530c-secret-grpc-tls\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " 
pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.146468 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.146442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67nxn\" (UniqueName: \"kubernetes.io/projected/45644160-70a2-4887-b258-6c0a50fd530c-kube-api-access-67nxn\") pod \"thanos-querier-8445558856-t2vgl\" (UID: \"45644160-70a2-4887-b258-6c0a50fd530c\") " pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.211175 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.211118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:08.349191 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.349040 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8445558856-t2vgl"] Apr 20 14:29:08.351701 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:08.351657 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45644160_70a2_4887_b258_6c0a50fd530c.slice/crio-71a54d24a9032b0615b4b04bd393aed02ac3c82c324e8bd7c9550edbad8ab6e6 WatchSource:0}: Error finding container 71a54d24a9032b0615b4b04bd393aed02ac3c82c324e8bd7c9550edbad8ab6e6: Status 404 returned error can't find the container with id 71a54d24a9032b0615b4b04bd393aed02ac3c82c324e8bd7c9550edbad8ab6e6 Apr 20 14:29:08.938564 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:08.938524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"71a54d24a9032b0615b4b04bd393aed02ac3c82c324e8bd7c9550edbad8ab6e6"} Apr 20 14:29:09.009659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.009620 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-66f6c54c88-wz6n8"] 
Apr 20 14:29:09.012587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.012569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.016000 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.015967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jg5n2\"" Apr 20 14:29:09.016159 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.016026 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 14:29:09.016159 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.015968 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 14:29:09.016159 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.016065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-45ivi3ss2470c\"" Apr 20 14:29:09.016159 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.016110 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 14:29:09.016389 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.016152 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 14:29:09.023155 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.023118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66f6c54c88-wz6n8"] Apr 20 14:29:09.143975 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.143936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/94161a15-8d5c-42b3-b76b-d8199963c145-audit-log\") 
pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144002 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-tls\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-metrics-server-audit-profiles\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-client-certs\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qc9\" (UniqueName: \"kubernetes.io/projected/94161a15-8d5c-42b3-b76b-d8199963c145-kube-api-access-48qc9\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " 
pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.144188 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.144183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-client-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245299 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-client-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/94161a15-8d5c-42b3-b76b-d8199963c145-audit-log\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-tls\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-metrics-server-audit-profiles\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-client-certs\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245692 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48qc9\" (UniqueName: \"kubernetes.io/projected/94161a15-8d5c-42b3-b76b-d8199963c145-kube-api-access-48qc9\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245692 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: 
\"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.245797 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.245756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/94161a15-8d5c-42b3-b76b-d8199963c145-audit-log\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.246259 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.246234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.246842 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.246807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/94161a15-8d5c-42b3-b76b-d8199963c145-metrics-server-audit-profiles\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.248485 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.248458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-client-certs\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.248598 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.248552 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-secret-metrics-server-tls\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.248663 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.248633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94161a15-8d5c-42b3-b76b-d8199963c145-client-ca-bundle\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.254483 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.254462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qc9\" (UniqueName: \"kubernetes.io/projected/94161a15-8d5c-42b3-b76b-d8199963c145-kube-api-access-48qc9\") pod \"metrics-server-66f6c54c88-wz6n8\" (UID: \"94161a15-8d5c-42b3-b76b-d8199963c145\") " pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.322612 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.322572 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:09.462093 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.462055 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66f6c54c88-wz6n8"] Apr 20 14:29:09.468567 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:09.468522 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94161a15_8d5c_42b3_b76b_d8199963c145.slice/crio-5118874730fb85d123b60d51f373711d973de09650680e06582af1a7dcd099d3 WatchSource:0}: Error finding container 5118874730fb85d123b60d51f373711d973de09650680e06582af1a7dcd099d3: Status 404 returned error can't find the container with id 5118874730fb85d123b60d51f373711d973de09650680e06582af1a7dcd099d3 Apr 20 14:29:09.943006 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:09.942967 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" event={"ID":"94161a15-8d5c-42b3-b76b-d8199963c145","Type":"ContainerStarted","Data":"5118874730fb85d123b60d51f373711d973de09650680e06582af1a7dcd099d3"} Apr 20 14:29:11.950155 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:11.950092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" event={"ID":"94161a15-8d5c-42b3-b76b-d8199963c145","Type":"ContainerStarted","Data":"c8006c78771ba7fe54e99b01c2e4ac3e50c635453092b61c538dd3fa0e5df4c2"} Apr 20 14:29:11.951997 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:11.951973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"a2f2c3250d4f5c87a7510b717517d3e252582404d398fe68bdee4d277c9cfed6"} Apr 20 14:29:11.951997 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:11.952001 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"618b86d024cf16d66de6d3224e99ae1ddc47ee1c7df2b33ce6ca457d19c609d2"} Apr 20 14:29:11.952180 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:11.952010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"192eb9373a7d6a3447e6c59b0d212a3e3571f48cf0d4c89ae4f2e8607a0f5372"} Apr 20 14:29:11.967796 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:11.967749 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" podStartSLOduration=2.307809085 podStartE2EDuration="3.967731269s" podCreationTimestamp="2026-04-20 14:29:08 +0000 UTC" firstStartedPulling="2026-04-20 14:29:09.470973048 +0000 UTC m=+133.677965470" lastFinishedPulling="2026-04-20 14:29:11.130895235 +0000 UTC m=+135.337887654" observedRunningTime="2026-04-20 14:29:11.96729699 +0000 UTC m=+136.174289467" watchObservedRunningTime="2026-04-20 14:29:11.967731269 +0000 UTC m=+136.174723712" Apr 20 14:29:12.483336 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:12.483287 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" podUID="5463f4d9-adfe-4439-bff5-2f1e0e27c141" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:29:12.958935 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:12.958891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"ba89f971b20af757c805a841aa31f75a6c8924b472f6449fded86cba7e95fd56"} Apr 20 14:29:12.959421 ip-10-0-140-30 kubenswrapper[2572]: 
I0420 14:29:12.958943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"44d89df23cd003dea408892547776208a1ad7673c1973a5e49e8cb01c98887b0"} Apr 20 14:29:12.959421 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:12.958957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" event={"ID":"45644160-70a2-4887-b258-6c0a50fd530c","Type":"ContainerStarted","Data":"4e50c6c100d7ba876679ba2672848336e2627377ad8bb3c0f6283230f4e60272"} Apr 20 14:29:12.959421 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:12.959000 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:13.006525 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:13.006468 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" podStartSLOduration=2.054957378 podStartE2EDuration="6.006451648s" podCreationTimestamp="2026-04-20 14:29:07 +0000 UTC" firstStartedPulling="2026-04-20 14:29:08.353543871 +0000 UTC m=+132.560536288" lastFinishedPulling="2026-04-20 14:29:12.305038138 +0000 UTC m=+136.512030558" observedRunningTime="2026-04-20 14:29:13.003957285 +0000 UTC m=+137.210949724" watchObservedRunningTime="2026-04-20 14:29:13.006451648 +0000 UTC m=+137.213444086" Apr 20 14:29:18.967707 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:18.967676 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8445558856-t2vgl" Apr 20 14:29:22.482930 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:22.482886 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" podUID="5463f4d9-adfe-4439-bff5-2f1e0e27c141" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:29:29.323466 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:29.323415 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:29.323466 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:29.323477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:32.173347 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:29:32.173297 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6ln9g" podUID="ed9eaae0-abf3-4e1a-89f2-760f6e63f14a" Apr 20 14:29:32.185488 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:29:32.185453 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b2q9g" podUID="2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4" Apr 20 14:29:32.482960 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:32.482917 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" podUID="5463f4d9-adfe-4439-bff5-2f1e0e27c141" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:29:32.483143 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:32.482994 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" Apr 20 14:29:32.483513 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:32.483481 2572 kuberuntime_manager.go:1107] "Message for Container of pod" 
containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"88919017e9ca04e3ebd3299bc208aeb2f9eb3df7f2e55a323ccd037875cf143d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 14:29:32.483565 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:32.483550 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" podUID="5463f4d9-adfe-4439-bff5-2f1e0e27c141" containerName="service-proxy" containerID="cri-o://88919017e9ca04e3ebd3299bc208aeb2f9eb3df7f2e55a323ccd037875cf143d" gracePeriod=30 Apr 20 14:29:33.012166 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:33.012114 2572 generic.go:358] "Generic (PLEG): container finished" podID="5463f4d9-adfe-4439-bff5-2f1e0e27c141" containerID="88919017e9ca04e3ebd3299bc208aeb2f9eb3df7f2e55a323ccd037875cf143d" exitCode=2 Apr 20 14:29:33.012166 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:33.012153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerDied","Data":"88919017e9ca04e3ebd3299bc208aeb2f9eb3df7f2e55a323ccd037875cf143d"} Apr 20 14:29:33.012364 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:33.012190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-58cfc5444d-bz2qk" event={"ID":"5463f4d9-adfe-4439-bff5-2f1e0e27c141","Type":"ContainerStarted","Data":"e7074777c30a036796cfd89ed5c97509ed709a2ecc39a64cf7188fd4559d2266"} Apr 20 14:29:33.012364 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:33.012258 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:29:33.012517 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:33.012502 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6ln9g" Apr 20 14:29:37.076244 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.076204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:29:37.076244 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.076249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:29:37.078644 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.078616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9eaae0-abf3-4e1a-89f2-760f6e63f14a-metrics-tls\") pod \"dns-default-6ln9g\" (UID: \"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a\") " pod="openshift-dns/dns-default-6ln9g" Apr 20 14:29:37.078749 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.078646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4-cert\") pod \"ingress-canary-b2q9g\" (UID: \"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4\") " pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:29:37.215745 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.215700 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\"" Apr 20 
14:29:37.216482 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.216465 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\"" Apr 20 14:29:37.223392 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.223363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b2q9g" Apr 20 14:29:37.223508 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.223490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6ln9g" Apr 20 14:29:37.350909 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.350882 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6ln9g"] Apr 20 14:29:37.353585 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:37.353556 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9eaae0_abf3_4e1a_89f2_760f6e63f14a.slice/crio-7d0727fd5fda128aa25f73ca4fb34255e38454e9dd3c72762b42c4ba0f761270 WatchSource:0}: Error finding container 7d0727fd5fda128aa25f73ca4fb34255e38454e9dd3c72762b42c4ba0f761270: Status 404 returned error can't find the container with id 7d0727fd5fda128aa25f73ca4fb34255e38454e9dd3c72762b42c4ba0f761270 Apr 20 14:29:37.375850 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:37.375823 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b2q9g"] Apr 20 14:29:37.378496 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:29:37.378470 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1b2d6d_33b7_4a6a_97a6_47d7a6c938e4.slice/crio-c7f6515b97d03d12241c91ef64d3f710720de3ea29311964dbc15a4b6aa554bd WatchSource:0}: Error finding container c7f6515b97d03d12241c91ef64d3f710720de3ea29311964dbc15a4b6aa554bd: Status 404 returned error can't 
find the container with id c7f6515b97d03d12241c91ef64d3f710720de3ea29311964dbc15a4b6aa554bd Apr 20 14:29:38.028038 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:38.027992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b2q9g" event={"ID":"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4","Type":"ContainerStarted","Data":"c7f6515b97d03d12241c91ef64d3f710720de3ea29311964dbc15a4b6aa554bd"} Apr 20 14:29:38.029522 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:38.029492 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6ln9g" event={"ID":"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a","Type":"ContainerStarted","Data":"7d0727fd5fda128aa25f73ca4fb34255e38454e9dd3c72762b42c4ba0f761270"} Apr 20 14:29:41.039347 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:41.039305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b2q9g" event={"ID":"2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4","Type":"ContainerStarted","Data":"8d13db1c1bcba37a022ecb9eb705a93f9f73f200fe0b928704d652423fb8b936"} Apr 20 14:29:41.040803 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:41.040777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6ln9g" event={"ID":"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a","Type":"ContainerStarted","Data":"5b46a58c49ffadb7e1d564f2ba01ecfd3a8ef4850fe32c35d207fa5463bde69f"} Apr 20 14:29:41.040918 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:41.040807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6ln9g" event={"ID":"ed9eaae0-abf3-4e1a-89f2-760f6e63f14a","Type":"ContainerStarted","Data":"a9d045eb0e9310f6bf9657d19583a62871d99ba1b04f09a64a04b3b1e74990cf"} Apr 20 14:29:41.040918 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:41.040899 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6ln9g" Apr 20 14:29:41.056698 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:29:41.056575 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b2q9g" podStartSLOduration=129.36998473 podStartE2EDuration="2m12.056556794s" podCreationTimestamp="2026-04-20 14:27:29 +0000 UTC" firstStartedPulling="2026-04-20 14:29:37.380341455 +0000 UTC m=+161.587333876" lastFinishedPulling="2026-04-20 14:29:40.066913523 +0000 UTC m=+164.273905940" observedRunningTime="2026-04-20 14:29:41.056004883 +0000 UTC m=+165.262997322" watchObservedRunningTime="2026-04-20 14:29:41.056556794 +0000 UTC m=+165.263549234" Apr 20 14:29:41.076941 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:41.076894 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6ln9g" podStartSLOduration=129.370059583 podStartE2EDuration="2m12.076879718s" podCreationTimestamp="2026-04-20 14:27:29 +0000 UTC" firstStartedPulling="2026-04-20 14:29:37.355451908 +0000 UTC m=+161.562444326" lastFinishedPulling="2026-04-20 14:29:40.062272044 +0000 UTC m=+164.269264461" observedRunningTime="2026-04-20 14:29:41.075107456 +0000 UTC m=+165.282099896" watchObservedRunningTime="2026-04-20 14:29:41.076879718 +0000 UTC m=+165.283872191" Apr 20 14:29:44.159558 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:44.159521 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6ln9g_ed9eaae0-abf3-4e1a-89f2-760f6e63f14a/dns/0.log" Apr 20 14:29:44.359263 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:44.359234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6ln9g_ed9eaae0-abf3-4e1a-89f2-760f6e63f14a/kube-rbac-proxy/0.log" Apr 20 14:29:45.358423 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:45.358346 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w77dh_26818813-da84-407b-b55f-77d9ffcbb474/dns-node-resolver/0.log" Apr 20 14:29:45.960366 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:29:45.960334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b2q9g_2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4/serve-healthcheck-canary/0.log" Apr 20 14:29:49.328684 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:49.328647 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:49.332416 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:49.332395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-66f6c54c88-wz6n8" Apr 20 14:29:51.046834 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:29:51.046802 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6ln9g" Apr 20 14:31:56.350207 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:31:56.350169 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log" Apr 20 14:31:56.350753 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:31:56.350573 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log" Apr 20 14:31:56.356909 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:31:56.356875 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 14:34:16.073230 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.073147 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m"] Apr 20 14:34:16.075108 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.075092 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.077595 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.077575 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 14:34:16.077766 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.077748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:34:16.077939 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.077925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-zk456\"" Apr 20 14:34:16.087269 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.087246 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m"] Apr 20 14:34:16.189166 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.189118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4740abe0-01e2-407c-939a-93441e51a9e9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.189316 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.189218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45s2g\" (UniqueName: \"kubernetes.io/projected/4740abe0-01e2-407c-939a-93441e51a9e9-kube-api-access-45s2g\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 
14:34:16.289784 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.289746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45s2g\" (UniqueName: \"kubernetes.io/projected/4740abe0-01e2-407c-939a-93441e51a9e9-kube-api-access-45s2g\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.289967 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.289801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4740abe0-01e2-407c-939a-93441e51a9e9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.290232 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.290209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4740abe0-01e2-407c-939a-93441e51a9e9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.298708 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.298680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45s2g\" (UniqueName: \"kubernetes.io/projected/4740abe0-01e2-407c-939a-93441e51a9e9-kube-api-access-45s2g\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-ssq7m\" (UID: \"4740abe0-01e2-407c-939a-93441e51a9e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.384178 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.384093 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" Apr 20 14:34:16.506654 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.506625 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m"] Apr 20 14:34:16.510257 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:34:16.510234 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4740abe0_01e2_407c_939a_93441e51a9e9.slice/crio-ca56795d1c4e350b69e17766052a3041cf012f8de35bac89e7689aeeeb422365 WatchSource:0}: Error finding container ca56795d1c4e350b69e17766052a3041cf012f8de35bac89e7689aeeeb422365: Status 404 returned error can't find the container with id ca56795d1c4e350b69e17766052a3041cf012f8de35bac89e7689aeeeb422365 Apr 20 14:34:16.512598 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.512580 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:34:16.747438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:16.747405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" event={"ID":"4740abe0-01e2-407c-939a-93441e51a9e9","Type":"ContainerStarted","Data":"ca56795d1c4e350b69e17766052a3041cf012f8de35bac89e7689aeeeb422365"} Apr 20 14:34:19.757952 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:19.757914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" event={"ID":"4740abe0-01e2-407c-939a-93441e51a9e9","Type":"ContainerStarted","Data":"3b30f9d5fd1e2f4ef269cdaca5b48e9f355f0b05ac00b824e6157ce8b47ccab9"} Apr 20 14:34:19.780453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:19.780259 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-ssq7m" podStartSLOduration=1.235746518 podStartE2EDuration="3.780243082s" podCreationTimestamp="2026-04-20 14:34:16 +0000 UTC" firstStartedPulling="2026-04-20 14:34:16.51273731 +0000 UTC m=+440.719729731" lastFinishedPulling="2026-04-20 14:34:19.057233863 +0000 UTC m=+443.264226295" observedRunningTime="2026-04-20 14:34:19.778571754 +0000 UTC m=+443.985564195" watchObservedRunningTime="2026-04-20 14:34:19.780243082 +0000 UTC m=+443.987235524" Apr 20 14:34:32.560997 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.560950 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv"] Apr 20 14:34:32.565827 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.565800 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.568463 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.568443 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-cxrxd\"" Apr 20 14:34:32.569528 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.569513 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:34:32.569634 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.569617 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 14:34:32.572420 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.572391 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv"] Apr 20 14:34:32.724705 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.724674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2zsqg\" (UniqueName: \"kubernetes.io/projected/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-kube-api-access-2zsqg\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.724877 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.724727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-tmp\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.825599 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.825512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zsqg\" (UniqueName: \"kubernetes.io/projected/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-kube-api-access-2zsqg\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.825599 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.825565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-tmp\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.825940 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.825921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-tmp\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.844738 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.844706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zsqg\" (UniqueName: \"kubernetes.io/projected/d59a3530-dca6-42a2-a64e-9c2a300e2fe9-kube-api-access-2zsqg\") pod \"openshift-lws-operator-bfc7f696d-m8tbv\" (UID: \"d59a3530-dca6-42a2-a64e-9c2a300e2fe9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.875683 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.875647 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" Apr 20 14:34:32.999824 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:32.999799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv"] Apr 20 14:34:33.002142 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:34:33.002098 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd59a3530_dca6_42a2_a64e_9c2a300e2fe9.slice/crio-953e7ebf2890ed80441003f5e94b5fceb536b6074bcf16464150a01eef4a5fa5 WatchSource:0}: Error finding container 953e7ebf2890ed80441003f5e94b5fceb536b6074bcf16464150a01eef4a5fa5: Status 404 returned error can't find the container with id 953e7ebf2890ed80441003f5e94b5fceb536b6074bcf16464150a01eef4a5fa5 Apr 20 14:34:33.797943 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:33.797892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" event={"ID":"d59a3530-dca6-42a2-a64e-9c2a300e2fe9","Type":"ContainerStarted","Data":"953e7ebf2890ed80441003f5e94b5fceb536b6074bcf16464150a01eef4a5fa5"} Apr 20 14:34:35.805887 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:35.805849 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" event={"ID":"d59a3530-dca6-42a2-a64e-9c2a300e2fe9","Type":"ContainerStarted","Data":"4dd1d462cf33f57097446465d082f153980ca4da43aab5a2e8da8368676d365f"} Apr 20 14:34:35.823016 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:35.822953 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m8tbv" podStartSLOduration=1.5324138710000001 podStartE2EDuration="3.822937817s" podCreationTimestamp="2026-04-20 14:34:32 +0000 UTC" firstStartedPulling="2026-04-20 14:34:33.003491285 +0000 UTC m=+457.210483702" lastFinishedPulling="2026-04-20 14:34:35.29401523 +0000 UTC m=+459.501007648" observedRunningTime="2026-04-20 14:34:35.822081236 +0000 UTC m=+460.029073675" watchObservedRunningTime="2026-04-20 14:34:35.822937817 +0000 UTC m=+460.029930256" Apr 20 14:34:55.835776 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.835742 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb"] Apr 20 14:34:55.838830 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.838812 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:55.841836 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.841812 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 14:34:55.841964 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.841814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 14:34:55.841964 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.841814 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-wjn8h\"" Apr 20 14:34:55.842196 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.842178 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 14:34:55.842289 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.842219 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 14:34:55.851822 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.851796 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb"] Apr 20 14:34:55.906879 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.906846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:55.907049 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.906891 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t55d\" (UniqueName: \"kubernetes.io/projected/e712a1b7-aae0-4453-b6d1-13e91b06477d-kube-api-access-5t55d\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:55.907049 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:55.906999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.008097 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.008063 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t55d\" (UniqueName: \"kubernetes.io/projected/e712a1b7-aae0-4453-b6d1-13e91b06477d-kube-api-access-5t55d\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.008293 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.008159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.008293 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.008214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.010634 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.010592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.010752 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.010644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e712a1b7-aae0-4453-b6d1-13e91b06477d-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.025814 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.025790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t55d\" (UniqueName: \"kubernetes.io/projected/e712a1b7-aae0-4453-b6d1-13e91b06477d-kube-api-access-5t55d\") pod \"opendatahub-operator-controller-manager-65c545df94-bmvsb\" (UID: \"e712a1b7-aae0-4453-b6d1-13e91b06477d\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.149640 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.149539 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:56.268746 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.268712 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb"] Apr 20 14:34:56.272574 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:34:56.272541 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode712a1b7_aae0_4453_b6d1_13e91b06477d.slice/crio-344896e541e43311aca0aa7a30a999dad3265f89e77a5683d4dc23579d4978be WatchSource:0}: Error finding container 344896e541e43311aca0aa7a30a999dad3265f89e77a5683d4dc23579d4978be: Status 404 returned error can't find the container with id 344896e541e43311aca0aa7a30a999dad3265f89e77a5683d4dc23579d4978be Apr 20 14:34:56.867257 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:56.867213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" event={"ID":"e712a1b7-aae0-4453-b6d1-13e91b06477d","Type":"ContainerStarted","Data":"344896e541e43311aca0aa7a30a999dad3265f89e77a5683d4dc23579d4978be"} Apr 20 14:34:58.875370 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:58.875328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" event={"ID":"e712a1b7-aae0-4453-b6d1-13e91b06477d","Type":"ContainerStarted","Data":"4e2f4daa14b6dcb074cc12ce48e55b6005a278e5e303a3698e203b88133236ba"} Apr 20 14:34:58.875795 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:58.875472 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:34:58.896433 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:34:58.896384 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" podStartSLOduration=1.4445422030000001 podStartE2EDuration="3.8963694s" podCreationTimestamp="2026-04-20 14:34:55 +0000 UTC" firstStartedPulling="2026-04-20 14:34:56.274553594 +0000 UTC m=+480.481546010" lastFinishedPulling="2026-04-20 14:34:58.726380786 +0000 UTC m=+482.933373207" observedRunningTime="2026-04-20 14:34:58.895819428 +0000 UTC m=+483.102811877" watchObservedRunningTime="2026-04-20 14:34:58.8963694 +0000 UTC m=+483.103361838" Apr 20 14:35:05.849561 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.849526 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk"] Apr 20 14:35:05.852944 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.852925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.856531 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.856456 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 14:35:05.856531 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.856477 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 14:35:05.856531 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.856512 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gcnhf\"" Apr 20 14:35:05.856769 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.856462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 14:35:05.868606 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.868582 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk"] Apr 20 14:35:05.880882 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.880852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wggw2\" (UniqueName: \"kubernetes.io/projected/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-kube-api-access-wggw2\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.880882 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.880885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.881072 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.880922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.881072 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.880993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-manager-config\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.982227 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.982189 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.982415 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.982242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-manager-config\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.982415 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.982290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wggw2\" (UniqueName: \"kubernetes.io/projected/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-kube-api-access-wggw2\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.982415 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.982328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.982967 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.982947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-manager-config\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: 
\"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.984748 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.984725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.984833 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.984768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:05.994818 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:05.994797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wggw2\" (UniqueName: \"kubernetes.io/projected/a7afbeac-dacc-4295-a2a3-d5072f22a1b1-kube-api-access-wggw2\") pod \"lws-controller-manager-bc7d4767f-lmgxk\" (UID: \"a7afbeac-dacc-4295-a2a3-d5072f22a1b1\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:06.162348 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:06.162261 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:06.292243 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:06.292160 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk"] Apr 20 14:35:06.295188 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:35:06.295161 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7afbeac_dacc_4295_a2a3_d5072f22a1b1.slice/crio-4ddc78c9016156899fd2ab255631a26abb19e1752ecaafab64947f93530bc6df WatchSource:0}: Error finding container 4ddc78c9016156899fd2ab255631a26abb19e1752ecaafab64947f93530bc6df: Status 404 returned error can't find the container with id 4ddc78c9016156899fd2ab255631a26abb19e1752ecaafab64947f93530bc6df Apr 20 14:35:06.905262 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:06.905221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" event={"ID":"a7afbeac-dacc-4295-a2a3-d5072f22a1b1","Type":"ContainerStarted","Data":"4ddc78c9016156899fd2ab255631a26abb19e1752ecaafab64947f93530bc6df"} Apr 20 14:35:08.911474 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:08.911443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" event={"ID":"a7afbeac-dacc-4295-a2a3-d5072f22a1b1","Type":"ContainerStarted","Data":"16ce052c1a14ad1537fc842613ab149b5361cbc33810a25edd45543381f4a501"} Apr 20 14:35:08.911888 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:08.911675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:08.939431 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:08.939389 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" podStartSLOduration=2.380222592 podStartE2EDuration="3.939375652s" podCreationTimestamp="2026-04-20 14:35:05 +0000 UTC" firstStartedPulling="2026-04-20 14:35:06.297448628 +0000 UTC m=+490.504441045" lastFinishedPulling="2026-04-20 14:35:07.856601687 +0000 UTC m=+492.063594105" observedRunningTime="2026-04-20 14:35:08.937703308 +0000 UTC m=+493.144695785" watchObservedRunningTime="2026-04-20 14:35:08.939375652 +0000 UTC m=+493.146368090" Apr 20 14:35:09.880574 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:09.880545 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-bmvsb" Apr 20 14:35:19.917152 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:19.917098 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-lmgxk" Apr 20 14:35:51.100105 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.100028 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"] Apr 20 14:35:51.103598 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.103570 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.106852 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.106826 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-bx2g6\"" Apr 20 14:35:51.106977 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.106826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 14:35:51.106977 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.106877 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 14:35:51.107077 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.106981 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 14:35:51.112542 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.112522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"] Apr 20 14:35:51.257882 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.257847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.257882 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.257882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.257909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.257949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.257979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.258002 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rvl2n\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-kube-api-access-rvl2n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.258020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258100 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.258045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.258316 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.258108 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359341 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359341 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl2n\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-kube-api-access-rvl2n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359555 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" Apr 20 14:35:51.359848 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.359904 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.359904 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.359985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.359985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.359931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.360251 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.360232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.361807 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.361787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.361985 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.361962 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.367070 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.367042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.367214 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.367198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvl2n\" (UniqueName: \"kubernetes.io/projected/0e9100e6-cd2c-43b5-999b-a00fa7a5048c-kube-api-access-rvl2n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5\" (UID: \"0e9100e6-cd2c-43b5-999b-a00fa7a5048c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.415085 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.415048 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:51.539578 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:51.539543 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"]
Apr 20 14:35:51.544114 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:35:51.544086 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9100e6_cd2c_43b5_999b_a00fa7a5048c.slice/crio-a9ce9b9a726f7d899ef15e1d596e4edd19307d55b2899bce15444bf4e789999e WatchSource:0}: Error finding container a9ce9b9a726f7d899ef15e1d596e4edd19307d55b2899bce15444bf4e789999e: Status 404 returned error can't find the container with id a9ce9b9a726f7d899ef15e1d596e4edd19307d55b2899bce15444bf4e789999e
Apr 20 14:35:52.039036 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:52.038990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" event={"ID":"0e9100e6-cd2c-43b5-999b-a00fa7a5048c","Type":"ContainerStarted","Data":"a9ce9b9a726f7d899ef15e1d596e4edd19307d55b2899bce15444bf4e789999e"}
Apr 20 14:35:54.088701 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:54.088656 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:35:54.089072 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:54.088743 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:35:54.089072 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:54.088787 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:35:55.050329 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:55.050292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" event={"ID":"0e9100e6-cd2c-43b5-999b-a00fa7a5048c","Type":"ContainerStarted","Data":"c164ec583e997c31eb87ffd60cde43e70ccd4dafb63d88a55f6f96408ba2a0c1"}
Apr 20 14:35:55.074841 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:55.074785 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5" podStartSLOduration=1.532436459 podStartE2EDuration="4.074770154s" podCreationTimestamp="2026-04-20 14:35:51 +0000 UTC" firstStartedPulling="2026-04-20 14:35:51.546075694 +0000 UTC m=+535.753068126" lastFinishedPulling="2026-04-20 14:35:54.0884094 +0000 UTC m=+538.295401821" observedRunningTime="2026-04-20 14:35:55.072841387 +0000 UTC m=+539.279833826" watchObservedRunningTime="2026-04-20 14:35:55.074770154 +0000 UTC m=+539.281762592"
Apr 20 14:35:55.415734 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:55.415646 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:55.420164 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:55.420120 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:56.053668 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:56.053631 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:35:56.054662 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:35:56.054645 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5"
Apr 20 14:36:14.193236 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.193199 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:14.196768 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.196751 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:14.199604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.199568 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 14:36:14.199604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.199597 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 14:36:14.199604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.199568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lh58z\""
Apr 20 14:36:14.206451 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.206427 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:14.248030 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.247994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfq9\" (UniqueName: \"kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9\") pod \"kuadrant-operator-catalog-fl8sn\" (UID: \"0b0776a8-5378-4755-acf6-99873a181e8b\") " pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:14.348630 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.348597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfq9\" (UniqueName: \"kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9\") pod \"kuadrant-operator-catalog-fl8sn\" (UID: \"0b0776a8-5378-4755-acf6-99873a181e8b\") " pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:14.363535 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.363513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfq9\" (UniqueName: \"kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9\") pod \"kuadrant-operator-catalog-fl8sn\" (UID: \"0b0776a8-5378-4755-acf6-99873a181e8b\") " pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:14.506247 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.506212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:14.563937 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.563907 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:14.630967 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.630935 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:14.633890 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:36:14.633857 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b0776a8_5378_4755_acf6_99873a181e8b.slice/crio-7ef4b3f73cfac79833a31ce94d474e167a4e0dd874c35a39c399e380fee9daec WatchSource:0}: Error finding container 7ef4b3f73cfac79833a31ce94d474e167a4e0dd874c35a39c399e380fee9daec: Status 404 returned error can't find the container with id 7ef4b3f73cfac79833a31ce94d474e167a4e0dd874c35a39c399e380fee9daec
Apr 20 14:36:14.776035 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.775952 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jjmkz"]
Apr 20 14:36:14.780294 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.780278 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:14.790203 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.790175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jjmkz"]
Apr 20 14:36:14.853287 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.853252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv848\" (UniqueName: \"kubernetes.io/projected/f5cce871-f463-4954-8e13-a8ed9232b69a-kube-api-access-fv848\") pod \"kuadrant-operator-catalog-jjmkz\" (UID: \"f5cce871-f463-4954-8e13-a8ed9232b69a\") " pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:14.954071 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.954032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv848\" (UniqueName: \"kubernetes.io/projected/f5cce871-f463-4954-8e13-a8ed9232b69a-kube-api-access-fv848\") pod \"kuadrant-operator-catalog-jjmkz\" (UID: \"f5cce871-f463-4954-8e13-a8ed9232b69a\") " pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:14.963453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:14.963421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv848\" (UniqueName: \"kubernetes.io/projected/f5cce871-f463-4954-8e13-a8ed9232b69a-kube-api-access-fv848\") pod \"kuadrant-operator-catalog-jjmkz\" (UID: \"f5cce871-f463-4954-8e13-a8ed9232b69a\") " pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:15.090375 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:15.090293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:15.113398 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:15.113367 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" event={"ID":"0b0776a8-5378-4755-acf6-99873a181e8b","Type":"ContainerStarted","Data":"7ef4b3f73cfac79833a31ce94d474e167a4e0dd874c35a39c399e380fee9daec"}
Apr 20 14:36:15.215146 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:15.215100 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jjmkz"]
Apr 20 14:36:15.235499 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:36:15.235466 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5cce871_f463_4954_8e13_a8ed9232b69a.slice/crio-f23ae6a49dc70a34592cb0207c1e35bdea244b5a6dadde590eb43e5b7a0acd1a WatchSource:0}: Error finding container f23ae6a49dc70a34592cb0207c1e35bdea244b5a6dadde590eb43e5b7a0acd1a: Status 404 returned error can't find the container with id f23ae6a49dc70a34592cb0207c1e35bdea244b5a6dadde590eb43e5b7a0acd1a
Apr 20 14:36:16.118252 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:16.118211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz" event={"ID":"f5cce871-f463-4954-8e13-a8ed9232b69a","Type":"ContainerStarted","Data":"f23ae6a49dc70a34592cb0207c1e35bdea244b5a6dadde590eb43e5b7a0acd1a"}
Apr 20 14:36:17.122452 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.122354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" event={"ID":"0b0776a8-5378-4755-acf6-99873a181e8b","Type":"ContainerStarted","Data":"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"}
Apr 20 14:36:17.122876 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.122443 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" podUID="0b0776a8-5378-4755-acf6-99873a181e8b" containerName="registry-server" containerID="cri-o://bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8" gracePeriod=2
Apr 20 14:36:17.123765 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.123728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz" event={"ID":"f5cce871-f463-4954-8e13-a8ed9232b69a","Type":"ContainerStarted","Data":"5e0b336aab80368d5685bdeab25281cad71df16f5e9e7f31f850e8711269069b"}
Apr 20 14:36:17.139741 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.139695 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" podStartSLOduration=1.044573647 podStartE2EDuration="3.139682519s" podCreationTimestamp="2026-04-20 14:36:14 +0000 UTC" firstStartedPulling="2026-04-20 14:36:14.635279765 +0000 UTC m=+558.842272185" lastFinishedPulling="2026-04-20 14:36:16.730388635 +0000 UTC m=+560.937381057" observedRunningTime="2026-04-20 14:36:17.138511062 +0000 UTC m=+561.345503501" watchObservedRunningTime="2026-04-20 14:36:17.139682519 +0000 UTC m=+561.346674958"
Apr 20 14:36:17.156630 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.156586 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz" podStartSLOduration=1.660073715 podStartE2EDuration="3.156575016s" podCreationTimestamp="2026-04-20 14:36:14 +0000 UTC" firstStartedPulling="2026-04-20 14:36:15.236929837 +0000 UTC m=+559.443922258" lastFinishedPulling="2026-04-20 14:36:16.733431129 +0000 UTC m=+560.940423559" observedRunningTime="2026-04-20 14:36:17.156203184 +0000 UTC m=+561.363195625" watchObservedRunningTime="2026-04-20 14:36:17.156575016 +0000 UTC m=+561.363567456"
Apr 20 14:36:17.358959 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.358938 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:17.476362 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.476332 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrfq9\" (UniqueName: \"kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9\") pod \"0b0776a8-5378-4755-acf6-99873a181e8b\" (UID: \"0b0776a8-5378-4755-acf6-99873a181e8b\") "
Apr 20 14:36:17.478483 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.478459 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9" (OuterVolumeSpecName: "kube-api-access-mrfq9") pod "0b0776a8-5378-4755-acf6-99873a181e8b" (UID: "0b0776a8-5378-4755-acf6-99873a181e8b"). InnerVolumeSpecName "kube-api-access-mrfq9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:36:17.577194 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:17.577159 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrfq9\" (UniqueName: \"kubernetes.io/projected/0b0776a8-5378-4755-acf6-99873a181e8b-kube-api-access-mrfq9\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:36:18.127422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.127326 2572 generic.go:358] "Generic (PLEG): container finished" podID="0b0776a8-5378-4755-acf6-99873a181e8b" containerID="bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8" exitCode=0
Apr 20 14:36:18.127422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.127383 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn"
Apr 20 14:36:18.127422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.127411 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" event={"ID":"0b0776a8-5378-4755-acf6-99873a181e8b","Type":"ContainerDied","Data":"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"}
Apr 20 14:36:18.127960 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.127445 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fl8sn" event={"ID":"0b0776a8-5378-4755-acf6-99873a181e8b","Type":"ContainerDied","Data":"7ef4b3f73cfac79833a31ce94d474e167a4e0dd874c35a39c399e380fee9daec"}
Apr 20 14:36:18.127960 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.127462 2572 scope.go:117] "RemoveContainer" containerID="bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"
Apr 20 14:36:18.135926 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.135907 2572 scope.go:117] "RemoveContainer" containerID="bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"
Apr 20 14:36:18.136190 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:36:18.136167 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8\": container with ID starting with bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8 not found: ID does not exist" containerID="bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"
Apr 20 14:36:18.136244 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.136202 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8"} err="failed to get container status \"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8\": rpc error: code = NotFound desc = could not find container \"bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8\": container with ID starting with bd3a6fa0a763febc80e92580c02e4ea43325a4fa8021496270488c250e85e8c8 not found: ID does not exist"
Apr 20 14:36:18.148104 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.148081 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:18.152995 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.152971 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fl8sn"]
Apr 20 14:36:18.467246 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:18.467213 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0776a8-5378-4755-acf6-99873a181e8b" path="/var/lib/kubelet/pods/0b0776a8-5378-4755-acf6-99873a181e8b/volumes"
Apr 20 14:36:25.091492 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:25.091449 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:25.091954 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:25.091524 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:25.112362 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:25.112338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:25.171002 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:25.170977 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-jjmkz"
Apr 20 14:36:40.920373 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.920335 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rrtcj"]
Apr 20 14:36:40.920739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.920644 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b0776a8-5378-4755-acf6-99873a181e8b" containerName="registry-server"
Apr 20 14:36:40.920739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.920656 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0776a8-5378-4755-acf6-99873a181e8b" containerName="registry-server"
Apr 20 14:36:40.920739 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.920702 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b0776a8-5378-4755-acf6-99873a181e8b" containerName="registry-server"
Apr 20 14:36:40.924384 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.924365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:40.927072 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.927055 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-9t5pk\""
Apr 20 14:36:40.940522 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:40.940496 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rrtcj"]
Apr 20 14:36:41.067354 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:41.067313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnvw\" (UniqueName: \"kubernetes.io/projected/e66278c8-aaf6-44bd-b6be-098b50d2d90e-kube-api-access-5xnvw\") pod \"authorino-operator-657f44b778-rrtcj\" (UID: \"e66278c8-aaf6-44bd-b6be-098b50d2d90e\") " pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:41.168411 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:41.168377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnvw\" (UniqueName: \"kubernetes.io/projected/e66278c8-aaf6-44bd-b6be-098b50d2d90e-kube-api-access-5xnvw\") pod \"authorino-operator-657f44b778-rrtcj\" (UID: \"e66278c8-aaf6-44bd-b6be-098b50d2d90e\") " pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:41.181163 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:41.181075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnvw\" (UniqueName: \"kubernetes.io/projected/e66278c8-aaf6-44bd-b6be-098b50d2d90e-kube-api-access-5xnvw\") pod \"authorino-operator-657f44b778-rrtcj\" (UID: \"e66278c8-aaf6-44bd-b6be-098b50d2d90e\") " pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:41.234144 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:41.234088 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:41.358529 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:41.358506 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rrtcj"]
Apr 20 14:36:42.208973 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:42.208936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj" event={"ID":"e66278c8-aaf6-44bd-b6be-098b50d2d90e","Type":"ContainerStarted","Data":"62e19c0063184c39b287a9c39da79f4eb659683230cdf53ae05a7758b962146c"}
Apr 20 14:36:44.216666 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.216630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj" event={"ID":"e66278c8-aaf6-44bd-b6be-098b50d2d90e","Type":"ContainerStarted","Data":"b6f9dd221b325287e7149e36692e4b2726348885610f8e2b8be033693cadacbc"}
Apr 20 14:36:44.217074 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.216798 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:44.232430 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.232382 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj" podStartSLOduration=2.13049961 podStartE2EDuration="4.232367373s" podCreationTimestamp="2026-04-20 14:36:40 +0000 UTC" firstStartedPulling="2026-04-20 14:36:41.367243159 +0000 UTC m=+585.574235578" lastFinishedPulling="2026-04-20 14:36:43.469110922 +0000 UTC m=+587.676103341" observedRunningTime="2026-04-20 14:36:44.231898757 +0000 UTC m=+588.438891197" watchObservedRunningTime="2026-04-20 14:36:44.232367373 +0000 UTC m=+588.439359813"
Apr 20 14:36:44.478138 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.478046 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"]
Apr 20 14:36:44.480677 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.480662 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:44.483157 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.483137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-lbw4c\""
Apr 20 14:36:44.483274 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.483142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 20 14:36:44.489670 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.489635 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"]
Apr 20 14:36:44.598059 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.598024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmzl\" (UniqueName: \"kubernetes.io/projected/60de93f7-8e60-4bae-870d-deb50faba7d5-kube-api-access-6mmzl\") pod \"dns-operator-controller-manager-648d5c98bc-jx76k\" (UID: \"60de93f7-8e60-4bae-870d-deb50faba7d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:44.699237 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.699201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmzl\" (UniqueName: \"kubernetes.io/projected/60de93f7-8e60-4bae-870d-deb50faba7d5-kube-api-access-6mmzl\") pod \"dns-operator-controller-manager-648d5c98bc-jx76k\" (UID: \"60de93f7-8e60-4bae-870d-deb50faba7d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:44.715906 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.715871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmzl\" (UniqueName: \"kubernetes.io/projected/60de93f7-8e60-4bae-870d-deb50faba7d5-kube-api-access-6mmzl\") pod \"dns-operator-controller-manager-648d5c98bc-jx76k\" (UID: \"60de93f7-8e60-4bae-870d-deb50faba7d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:44.791121 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.791034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:44.933214 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:44.933178 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"]
Apr 20 14:36:44.936302 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:36:44.936271 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60de93f7_8e60_4bae_870d_deb50faba7d5.slice/crio-40b787a7a0422154ad64afb48acec044110326e00a9a5734330387cdb901cdb5 WatchSource:0}: Error finding container 40b787a7a0422154ad64afb48acec044110326e00a9a5734330387cdb901cdb5: Status 404 returned error can't find the container with id 40b787a7a0422154ad64afb48acec044110326e00a9a5734330387cdb901cdb5
Apr 20 14:36:45.221255 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:45.221221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k" event={"ID":"60de93f7-8e60-4bae-870d-deb50faba7d5","Type":"ContainerStarted","Data":"40b787a7a0422154ad64afb48acec044110326e00a9a5734330387cdb901cdb5"}
Apr 20 14:36:47.228135 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:47.228085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k" event={"ID":"60de93f7-8e60-4bae-870d-deb50faba7d5","Type":"ContainerStarted","Data":"c045ca52f614718335b77978b070de4da6412fa126363f288afcd459eaff6fc3"}
Apr 20 14:36:47.228609 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:47.228209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:36:47.276787 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:47.276737 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k" podStartSLOduration=1.515621657 podStartE2EDuration="3.276720334s" podCreationTimestamp="2026-04-20 14:36:44 +0000 UTC" firstStartedPulling="2026-04-20 14:36:44.938805719 +0000 UTC m=+589.145798136" lastFinishedPulling="2026-04-20 14:36:46.699904396 +0000 UTC m=+590.906896813" observedRunningTime="2026-04-20 14:36:47.274430628 +0000 UTC m=+591.481423067" watchObservedRunningTime="2026-04-20 14:36:47.276720334 +0000 UTC m=+591.483712772"
Apr 20 14:36:50.942147 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:50.942097 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"]
Apr 20 14:36:50.946506 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:50.946489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:50.948987 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:50.948970 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-h229m\""
Apr 20 14:36:50.949158 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:50.949118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5k7\" (UniqueName: \"kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7\") pod \"limitador-operator-controller-manager-85c4996f8c-ftndk\" (UID: \"1729bed1-1d0e-4d67-b111-36a482865107\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:50.956743 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:50.956719 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"]
Apr 20 14:36:51.050462 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:51.050431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5k7\" (UniqueName: \"kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7\") pod \"limitador-operator-controller-manager-85c4996f8c-ftndk\" (UID: \"1729bed1-1d0e-4d67-b111-36a482865107\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:51.065260 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:51.065235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5k7\" (UniqueName: \"kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7\") pod \"limitador-operator-controller-manager-85c4996f8c-ftndk\" (UID: \"1729bed1-1d0e-4d67-b111-36a482865107\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:51.257624 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:51.257586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:51.377414 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:51.377390 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"]
Apr 20 14:36:51.380229 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:36:51.380201 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1729bed1_1d0e_4d67_b111_36a482865107.slice/crio-f7fd46606dd43cc0755da7d925a3a1f88c2c40010d8d79836273e570329e154b WatchSource:0}: Error finding container f7fd46606dd43cc0755da7d925a3a1f88c2c40010d8d79836273e570329e154b: Status 404 returned error can't find the container with id f7fd46606dd43cc0755da7d925a3a1f88c2c40010d8d79836273e570329e154b
Apr 20 14:36:52.244909 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:52.244873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" event={"ID":"1729bed1-1d0e-4d67-b111-36a482865107","Type":"ContainerStarted","Data":"f7fd46606dd43cc0755da7d925a3a1f88c2c40010d8d79836273e570329e154b"}
Apr 20 14:36:54.254649 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:54.254606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" event={"ID":"1729bed1-1d0e-4d67-b111-36a482865107","Type":"ContainerStarted","Data":"cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50"}
Apr 20 14:36:54.255032 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:54.254733 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:36:54.274231 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:54.274183 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" podStartSLOduration=2.20069418 podStartE2EDuration="4.27416741s" podCreationTimestamp="2026-04-20 14:36:50 +0000 UTC" firstStartedPulling="2026-04-20 14:36:51.382468166 +0000 UTC m=+595.589460596" lastFinishedPulling="2026-04-20 14:36:53.45594141 +0000 UTC m=+597.662933826" observedRunningTime="2026-04-20 14:36:54.272783074 +0000 UTC m=+598.479775540" watchObservedRunningTime="2026-04-20 14:36:54.27416741 +0000 UTC m=+598.481159849"
Apr 20 14:36:55.224069 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:55.224037 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-rrtcj"
Apr 20 14:36:56.374288 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:56.374263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:36:56.374694 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:56.374271 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:36:58.233869 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:36:58.233838 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jx76k"
Apr 20 14:37:05.260838 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:05.260806 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"
Apr 20 14:37:09.283487 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.283450 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"]
Apr 20 14:37:09.283958 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.283759 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" podUID="1729bed1-1d0e-4d67-b111-36a482865107" containerName="manager" containerID="cri-o://cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50" gracePeriod=2
Apr 20 14:37:09.293055 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.293029 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk"]
Apr 20 14:37:09.328776 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.328746 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4"]
Apr 20 14:37:09.329199 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.329184 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1729bed1-1d0e-4d67-b111-36a482865107" containerName="manager"
Apr 20 14:37:09.329254 ip-10-0-140-30
kubenswrapper[2572]: I0420 14:37:09.329203 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1729bed1-1d0e-4d67-b111-36a482865107" containerName="manager" Apr 20 14:37:09.329307 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.329295 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1729bed1-1d0e-4d67-b111-36a482865107" containerName="manager" Apr 20 14:37:09.332302 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.332280 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:09.334715 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.334682 2572 status_manager.go:895] "Failed to get status for pod" podUID="1729bed1-1d0e-4d67-b111-36a482865107" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" err="pods \"limitador-operator-controller-manager-85c4996f8c-ftndk\" is forbidden: User \"system:node:ip-10-0-140-30.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-30.ec2.internal' and this object" Apr 20 14:37:09.343609 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.343586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4"] Apr 20 14:37:09.399620 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.399592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74mp\" (UniqueName: \"kubernetes.io/projected/03712bcc-6d66-4b0e-8fc2-75f85bdf622b-kube-api-access-w74mp\") pod \"limitador-operator-controller-manager-85c4996f8c-bjpj4\" (UID: \"03712bcc-6d66-4b0e-8fc2-75f85bdf622b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:09.500493 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.500464 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w74mp\" (UniqueName: \"kubernetes.io/projected/03712bcc-6d66-4b0e-8fc2-75f85bdf622b-kube-api-access-w74mp\") pod \"limitador-operator-controller-manager-85c4996f8c-bjpj4\" (UID: \"03712bcc-6d66-4b0e-8fc2-75f85bdf622b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:09.511140 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.511110 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" Apr 20 14:37:09.511246 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.511148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74mp\" (UniqueName: \"kubernetes.io/projected/03712bcc-6d66-4b0e-8fc2-75f85bdf622b-kube-api-access-w74mp\") pod \"limitador-operator-controller-manager-85c4996f8c-bjpj4\" (UID: \"03712bcc-6d66-4b0e-8fc2-75f85bdf622b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:09.513212 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.513182 2572 status_manager.go:895] "Failed to get status for pod" podUID="1729bed1-1d0e-4d67-b111-36a482865107" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" err="pods \"limitador-operator-controller-manager-85c4996f8c-ftndk\" is forbidden: User \"system:node:ip-10-0-140-30.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-30.ec2.internal' and this object" Apr 20 14:37:09.601721 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.601652 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5k7\" (UniqueName: \"kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7\") pod \"1729bed1-1d0e-4d67-b111-36a482865107\" (UID: 
\"1729bed1-1d0e-4d67-b111-36a482865107\") " Apr 20 14:37:09.603557 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.603535 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7" (OuterVolumeSpecName: "kube-api-access-pb5k7") pod "1729bed1-1d0e-4d67-b111-36a482865107" (UID: "1729bed1-1d0e-4d67-b111-36a482865107"). InnerVolumeSpecName "kube-api-access-pb5k7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:37:09.667052 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.667019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:09.703267 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.703234 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pb5k7\" (UniqueName: \"kubernetes.io/projected/1729bed1-1d0e-4d67-b111-36a482865107-kube-api-access-pb5k7\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\"" Apr 20 14:37:09.784898 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:09.784875 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4"] Apr 20 14:37:09.787398 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:37:09.787372 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03712bcc_6d66_4b0e_8fc2_75f85bdf622b.slice/crio-f480b6dc08565f7e0bd43b8b522f63c285c7454e4b86c87a761ccff7e5ca9a04 WatchSource:0}: Error finding container f480b6dc08565f7e0bd43b8b522f63c285c7454e4b86c87a761ccff7e5ca9a04: Status 404 returned error can't find the container with id f480b6dc08565f7e0bd43b8b522f63c285c7454e4b86c87a761ccff7e5ca9a04 Apr 20 14:37:10.309731 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.309698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" event={"ID":"03712bcc-6d66-4b0e-8fc2-75f85bdf622b","Type":"ContainerStarted","Data":"3a5744647ed26f00a113d0553ecaa55f26ba7f2dc67e3502cc855f18b309680d"} Apr 20 14:37:10.309731 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.309734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" event={"ID":"03712bcc-6d66-4b0e-8fc2-75f85bdf622b","Type":"ContainerStarted","Data":"f480b6dc08565f7e0bd43b8b522f63c285c7454e4b86c87a761ccff7e5ca9a04"} Apr 20 14:37:10.310235 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.309766 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:10.310862 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.310835 2572 generic.go:358] "Generic (PLEG): container finished" podID="1729bed1-1d0e-4d67-b111-36a482865107" containerID="cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50" exitCode=0 Apr 20 14:37:10.310969 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.310869 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" Apr 20 14:37:10.310969 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.310926 2572 scope.go:117] "RemoveContainer" containerID="cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50" Apr 20 14:37:10.312063 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.312042 2572 status_manager.go:895] "Failed to get status for pod" podUID="1729bed1-1d0e-4d67-b111-36a482865107" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" err="pods \"limitador-operator-controller-manager-85c4996f8c-ftndk\" is forbidden: User \"system:node:ip-10-0-140-30.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-30.ec2.internal' and this object" Apr 20 14:37:10.318820 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.318801 2572 scope.go:117] "RemoveContainer" containerID="cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50" Apr 20 14:37:10.319079 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:37:10.319059 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50\": container with ID starting with cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50 not found: ID does not exist" containerID="cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50" Apr 20 14:37:10.319159 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.319090 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50"} err="failed to get container status \"cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50\": rpc error: code = NotFound desc = could not find container 
\"cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50\": container with ID starting with cf827c9ca602517b3ada1edd25c76d2ec585c5fa216fb55e647fbcd049ef2e50 not found: ID does not exist" Apr 20 14:37:10.331341 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.331300 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" podStartSLOduration=1.331289211 podStartE2EDuration="1.331289211s" podCreationTimestamp="2026-04-20 14:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:37:10.32943328 +0000 UTC m=+614.536425719" watchObservedRunningTime="2026-04-20 14:37:10.331289211 +0000 UTC m=+614.538281650" Apr 20 14:37:10.331476 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.331454 2572 status_manager.go:895] "Failed to get status for pod" podUID="1729bed1-1d0e-4d67-b111-36a482865107" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ftndk" err="pods \"limitador-operator-controller-manager-85c4996f8c-ftndk\" is forbidden: User \"system:node:ip-10-0-140-30.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-30.ec2.internal' and this object" Apr 20 14:37:10.467137 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:10.467088 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1729bed1-1d0e-4d67-b111-36a482865107" path="/var/lib/kubelet/pods/1729bed1-1d0e-4d67-b111-36a482865107/volumes" Apr 20 14:37:21.317709 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:21.317618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bjpj4" Apr 20 14:37:37.912218 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:37.912181 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"] Apr 20 14:37:37.915717 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:37.915694 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:37.919196 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:37.919172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-hdws7\"" Apr 20 14:37:37.928477 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:37.928454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"] Apr 20 14:37:38.028445 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqsns\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-kube-api-access-dqsns\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9b585b4d-a336-46c5-b72f-94313026996d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028587 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028814 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.028814 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.028662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b585b4d-a336-46c5-b72f-94313026996d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129515 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsns\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-kube-api-access-dqsns\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9b585b4d-a336-46c5-b72f-94313026996d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129685 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129925 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b585b4d-a336-46c5-b72f-94313026996d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.129925 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.129745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.130063 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.130033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.130208 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.130149 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.130208 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.130199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.130296 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.130258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.130442 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.130424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9b585b4d-a336-46c5-b72f-94313026996d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.132064 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.132043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/9b585b4d-a336-46c5-b72f-94313026996d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.132177 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.132155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b585b4d-a336-46c5-b72f-94313026996d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.138304 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.138282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.138454 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.138432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqsns\" (UniqueName: \"kubernetes.io/projected/9b585b4d-a336-46c5-b72f-94313026996d-kube-api-access-dqsns\") pod \"maas-default-gateway-openshift-default-58b6f876-ldmzg\" (UID: \"9b585b4d-a336-46c5-b72f-94313026996d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" Apr 20 14:37:38.227568 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.227550 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"
Apr 20 14:37:38.344473 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.344385 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"]
Apr 20 14:37:38.346814 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:37:38.346786 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b585b4d_a336_46c5_b72f_94313026996d.slice/crio-4d17d2940b00e064508e046ec1563c47c0a62820aa41cdfbab16c53acfdc5da8 WatchSource:0}: Error finding container 4d17d2940b00e064508e046ec1563c47c0a62820aa41cdfbab16c53acfdc5da8: Status 404 returned error can't find the container with id 4d17d2940b00e064508e046ec1563c47c0a62820aa41cdfbab16c53acfdc5da8
Apr 20 14:37:38.348935 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.348899 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:37:38.349024 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.348980 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:37:38.349064 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.349023 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 14:37:38.405271 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.405246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" event={"ID":"9b585b4d-a336-46c5-b72f-94313026996d","Type":"ContainerStarted","Data":"d4b367e67930f7c63c7269d7e89e7365eb1f1385c0ca2cd6a9ee72945e0515c8"}
Apr 20 14:37:38.405354 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.405279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" event={"ID":"9b585b4d-a336-46c5-b72f-94313026996d","Type":"ContainerStarted","Data":"4d17d2940b00e064508e046ec1563c47c0a62820aa41cdfbab16c53acfdc5da8"}
Apr 20 14:37:38.422995 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:38.422949 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg" podStartSLOduration=1.422933832 podStartE2EDuration="1.422933832s" podCreationTimestamp="2026-04-20 14:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:37:38.422017136 +0000 UTC m=+642.629009576" watchObservedRunningTime="2026-04-20 14:37:38.422933832 +0000 UTC m=+642.629926267"
Apr 20 14:37:39.228181 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:39.228151 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"
Apr 20 14:37:39.233051 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:39.233028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"
Apr 20 14:37:39.413201 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:39.413163 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"
Apr 20 14:37:39.414266 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:39.414234 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldmzg"
Apr 20 14:37:52.412432 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.412394 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:37:52.415318 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.415294 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:52.417741 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.417719 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-vjs46\""
Apr 20 14:37:52.422910 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.422890 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:37:52.542311 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.542277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67f58\" (UniqueName: \"kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58\") pod \"authorino-f99f4b5cd-99v9k\" (UID: \"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7\") " pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:52.596348 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.596316 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"]
Apr 20 14:37:52.599503 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.599486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kb9bg"
Apr 20 14:37:52.601465 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.601444 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"]
Apr 20 14:37:52.642996 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.642973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67f58\" (UniqueName: \"kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58\") pod \"authorino-f99f4b5cd-99v9k\" (UID: \"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7\") " pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:52.651635 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.651597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67f58\" (UniqueName: \"kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58\") pod \"authorino-f99f4b5cd-99v9k\" (UID: \"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7\") " pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:52.726110 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.726085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:52.744051 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.744024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44k8\" (UniqueName: \"kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8\") pod \"authorino-7498df8756-kb9bg\" (UID: \"be6ae1da-7583-4c2d-85ca-17d8e898c0bf\") " pod="kuadrant-system/authorino-7498df8756-kb9bg"
Apr 20 14:37:52.842504 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.842482 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:37:52.844676 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.844651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n44k8\" (UniqueName: \"kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8\") pod \"authorino-7498df8756-kb9bg\" (UID: \"be6ae1da-7583-4c2d-85ca-17d8e898c0bf\") " pod="kuadrant-system/authorino-7498df8756-kb9bg"
Apr 20 14:37:52.845155 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:37:52.845107 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be2bf6d_cc08_48a8_be71_a7c74f7d46c7.slice/crio-7d22fb431a63abef2db8dee73f99b4779029c3edb018fc9b599d1cb819f31819 WatchSource:0}: Error finding container 7d22fb431a63abef2db8dee73f99b4779029c3edb018fc9b599d1cb819f31819: Status 404 returned error can't find the container with id 7d22fb431a63abef2db8dee73f99b4779029c3edb018fc9b599d1cb819f31819
Apr 20 14:37:52.853376 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.853353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44k8\" (UniqueName: \"kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8\") pod \"authorino-7498df8756-kb9bg\" (UID: \"be6ae1da-7583-4c2d-85ca-17d8e898c0bf\") " pod="kuadrant-system/authorino-7498df8756-kb9bg"
Apr 20 14:37:52.909606 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:52.909580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kb9bg"
Apr 20 14:37:53.021504 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:53.021476 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"]
Apr 20 14:37:53.023880 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:37:53.023855 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe6ae1da_7583_4c2d_85ca_17d8e898c0bf.slice/crio-3b84f5fbaa91ecc49092fdfe041e116d3035cfb32afea2f884a19ce9a00d67b8 WatchSource:0}: Error finding container 3b84f5fbaa91ecc49092fdfe041e116d3035cfb32afea2f884a19ce9a00d67b8: Status 404 returned error can't find the container with id 3b84f5fbaa91ecc49092fdfe041e116d3035cfb32afea2f884a19ce9a00d67b8
Apr 20 14:37:53.454969 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:53.454885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kb9bg" event={"ID":"be6ae1da-7583-4c2d-85ca-17d8e898c0bf","Type":"ContainerStarted","Data":"3b84f5fbaa91ecc49092fdfe041e116d3035cfb32afea2f884a19ce9a00d67b8"}
Apr 20 14:37:53.455931 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:53.455906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" event={"ID":"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7","Type":"ContainerStarted","Data":"7d22fb431a63abef2db8dee73f99b4779029c3edb018fc9b599d1cb819f31819"}
Apr 20 14:37:56.467337 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:56.467302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" event={"ID":"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7","Type":"ContainerStarted","Data":"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"}
Apr 20 14:37:56.468327 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:56.468304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kb9bg" event={"ID":"be6ae1da-7583-4c2d-85ca-17d8e898c0bf","Type":"ContainerStarted","Data":"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b"}
Apr 20 14:37:56.503154 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:56.503097 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" podStartSLOduration=1.829985417 podStartE2EDuration="4.503085842s" podCreationTimestamp="2026-04-20 14:37:52 +0000 UTC" firstStartedPulling="2026-04-20 14:37:52.846472242 +0000 UTC m=+657.053464671" lastFinishedPulling="2026-04-20 14:37:55.519572678 +0000 UTC m=+659.726565096" observedRunningTime="2026-04-20 14:37:56.501621641 +0000 UTC m=+660.708614080" watchObservedRunningTime="2026-04-20 14:37:56.503085842 +0000 UTC m=+660.710078282"
Apr 20 14:37:56.518286 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:56.518164 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-kb9bg" podStartSLOduration=2.033757681 podStartE2EDuration="4.51814828s" podCreationTimestamp="2026-04-20 14:37:52 +0000 UTC" firstStartedPulling="2026-04-20 14:37:53.02521863 +0000 UTC m=+657.232211051" lastFinishedPulling="2026-04-20 14:37:55.509609234 +0000 UTC m=+659.716601650" observedRunningTime="2026-04-20 14:37:56.516837098 +0000 UTC m=+660.723829538" watchObservedRunningTime="2026-04-20 14:37:56.51814828 +0000 UTC m=+660.725140720"
Apr 20 14:37:56.542300 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:56.542264 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:37:58.474999 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:58.474960 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" podUID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" containerName="authorino" containerID="cri-o://d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673" gracePeriod=30
Apr 20 14:37:58.706166 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:58.706111 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:58.892577 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:58.892498 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67f58\" (UniqueName: \"kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58\") pod \"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7\" (UID: \"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7\") "
Apr 20 14:37:58.894511 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:58.894479 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58" (OuterVolumeSpecName: "kube-api-access-67f58") pod "8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" (UID: "8be2bf6d-cc08-48a8-be71-a7c74f7d46c7"). InnerVolumeSpecName "kube-api-access-67f58". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:37:58.994149 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:58.994087 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67f58\" (UniqueName: \"kubernetes.io/projected/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7-kube-api-access-67f58\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:37:59.479381 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.479345 2572 generic.go:358] "Generic (PLEG): container finished" podID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" containerID="d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673" exitCode=0
Apr 20 14:37:59.479833 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.479401 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-99v9k"
Apr 20 14:37:59.479833 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.479420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" event={"ID":"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7","Type":"ContainerDied","Data":"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"}
Apr 20 14:37:59.479833 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.479453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-99v9k" event={"ID":"8be2bf6d-cc08-48a8-be71-a7c74f7d46c7","Type":"ContainerDied","Data":"7d22fb431a63abef2db8dee73f99b4779029c3edb018fc9b599d1cb819f31819"}
Apr 20 14:37:59.479833 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.479473 2572 scope.go:117] "RemoveContainer" containerID="d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"
Apr 20 14:37:59.487296 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.487276 2572 scope.go:117] "RemoveContainer" containerID="d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"
Apr 20 14:37:59.487556 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:37:59.487538 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673\": container with ID starting with d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673 not found: ID does not exist" containerID="d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"
Apr 20 14:37:59.487628 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.487564 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673"} err="failed to get container status \"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673\": rpc error: code = NotFound desc = could not find container \"d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673\": container with ID starting with d36e4471102808a17ab0384d9cdd0a1c6c2521ceb57e3138ad64b9a2cfcce673 not found: ID does not exist"
Apr 20 14:37:59.499264 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.499233 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:37:59.504520 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:37:59.504499 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-99v9k"]
Apr 20 14:38:00.467581 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:00.467548 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" path="/var/lib/kubelet/pods/8be2bf6d-cc08-48a8-be71-a7c74f7d46c7/volumes"
Apr 20 14:38:21.260778 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.260743 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"]
Apr 20 14:38:21.261233 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.261214 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" containerName="authorino"
Apr 20 14:38:21.261301 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.261237 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" containerName="authorino"
Apr 20 14:38:21.261344 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.261316 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8be2bf6d-cc08-48a8-be71-a7c74f7d46c7" containerName="authorino"
Apr 20 14:38:21.264500 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.264476 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:21.270228 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.270195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"]
Apr 20 14:38:21.383228 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.383188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c465w\" (UniqueName: \"kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w\") pod \"authorino-8b475cf9f-hvzzl\" (UID: \"323fe8ec-b078-45da-a522-a05f20dd6695\") " pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:21.484494 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.484459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c465w\" (UniqueName: \"kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w\") pod \"authorino-8b475cf9f-hvzzl\" (UID: \"323fe8ec-b078-45da-a522-a05f20dd6695\") " pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:21.492458 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.492427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c465w\" (UniqueName: \"kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w\") pod \"authorino-8b475cf9f-hvzzl\" (UID: \"323fe8ec-b078-45da-a522-a05f20dd6695\") " pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:21.501480 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.501454 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"]
Apr 20 14:38:21.501704 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.501692 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:21.528191 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.528158 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"]
Apr 20 14:38:21.532572 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.532543 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:21.539061 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.539021 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"]
Apr 20 14:38:21.585998 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.585909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jft9\" (UniqueName: \"kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9\") pod \"authorino-7575d44875-6sj6r\" (UID: \"927dfb41-e18c-483b-8260-8cbdc8b22e89\") " pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:21.628780 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.628754 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"]
Apr 20 14:38:21.631319 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:38:21.631293 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod323fe8ec_b078_45da_a522_a05f20dd6695.slice/crio-12a25de31de30eca9d7d4c135d6f28cfc794e6b08d885aa0f1ee916ccb2dfeb2 WatchSource:0}: Error finding container 12a25de31de30eca9d7d4c135d6f28cfc794e6b08d885aa0f1ee916ccb2dfeb2: Status 404 returned error can't find the container with id 12a25de31de30eca9d7d4c135d6f28cfc794e6b08d885aa0f1ee916ccb2dfeb2
Apr 20 14:38:21.687036 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.687006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jft9\" (UniqueName: \"kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9\") pod \"authorino-7575d44875-6sj6r\" (UID: \"927dfb41-e18c-483b-8260-8cbdc8b22e89\") " pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:21.695356 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.695325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jft9\" (UniqueName: \"kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9\") pod \"authorino-7575d44875-6sj6r\" (UID: \"927dfb41-e18c-483b-8260-8cbdc8b22e89\") " pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:21.844239 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.844161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:21.850893 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.850866 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"]
Apr 20 14:38:21.881541 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.879753 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:38:21.886840 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.886804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:21.887003 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.886863 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:38:21.889563 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.889540 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 14:38:21.971032 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.971009 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"]
Apr 20 14:38:21.973065 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:38:21.973033 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod927dfb41_e18c_483b_8260_8cbdc8b22e89.slice/crio-261e337495b3bff1be889b3579c0cfc0d1d24e0690b2a41b84bdea0247d7b8dc WatchSource:0}: Error finding container 261e337495b3bff1be889b3579c0cfc0d1d24e0690b2a41b84bdea0247d7b8dc: Status 404 returned error can't find the container with id 261e337495b3bff1be889b3579c0cfc0d1d24e0690b2a41b84bdea0247d7b8dc
Apr 20 14:38:21.989302 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.989271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzgz\" (UniqueName: \"kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:21.989422 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:21.989378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.090471 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.090445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.090582 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.090493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glzgz\" (UniqueName: \"kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.092637 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.092618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.097548 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.097487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzgz\" (UniqueName: \"kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz\") pod \"authorino-6cc55c779-2dzhk\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") " pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.198415 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.198378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:38:22.316541 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.316516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:38:22.551677 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.551638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" event={"ID":"323fe8ec-b078-45da-a522-a05f20dd6695","Type":"ContainerStarted","Data":"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"}
Apr 20 14:38:22.551677 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.551680 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" event={"ID":"323fe8ec-b078-45da-a522-a05f20dd6695","Type":"ContainerStarted","Data":"12a25de31de30eca9d7d4c135d6f28cfc794e6b08d885aa0f1ee916ccb2dfeb2"}
Apr 20 14:38:22.551921 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.551707 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" podUID="323fe8ec-b078-45da-a522-a05f20dd6695" containerName="authorino" containerID="cri-o://07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc" gracePeriod=30
Apr 20 14:38:22.552786 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.552762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cc55c779-2dzhk" event={"ID":"9b6a99b7-42a5-4dde-8285-ebeceef1122e","Type":"ContainerStarted","Data":"48b88f76441598641aa1028d098d8a88749eb3760aa62acd3b59a174e21ad178"}
Apr 20 14:38:22.554118 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.554090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7575d44875-6sj6r" event={"ID":"927dfb41-e18c-483b-8260-8cbdc8b22e89","Type":"ContainerStarted","Data":"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"}
Apr 20 14:38:22.554242 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.554144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7575d44875-6sj6r" event={"ID":"927dfb41-e18c-483b-8260-8cbdc8b22e89","Type":"ContainerStarted","Data":"261e337495b3bff1be889b3579c0cfc0d1d24e0690b2a41b84bdea0247d7b8dc"}
Apr 20 14:38:22.554242 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.554141 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7575d44875-6sj6r" podUID="927dfb41-e18c-483b-8260-8cbdc8b22e89" containerName="authorino" containerID="cri-o://52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc" gracePeriod=30
Apr 20 14:38:22.566818 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.566767 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" podStartSLOduration=1.169150872 podStartE2EDuration="1.566750288s" podCreationTimestamp="2026-04-20 14:38:21 +0000 UTC" firstStartedPulling="2026-04-20 14:38:21.632686753 +0000 UTC m=+685.839679173" lastFinishedPulling="2026-04-20 14:38:22.030286171 +0000 UTC m=+686.237278589" observedRunningTime="2026-04-20 14:38:22.565974583 +0000 UTC m=+686.772967022" watchObservedRunningTime="2026-04-20 14:38:22.566750288 +0000 UTC m=+686.773742728"
Apr 20 14:38:22.579282 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.579232 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7575d44875-6sj6r" podStartSLOduration=1.249044437 podStartE2EDuration="1.579213927s" podCreationTimestamp="2026-04-20 14:38:21 +0000 UTC" firstStartedPulling="2026-04-20 14:38:21.974322261 +0000 UTC m=+686.181314677" lastFinishedPulling="2026-04-20 14:38:22.304491746 +0000 UTC m=+686.511484167" observedRunningTime="2026-04-20 14:38:22.578664689 +0000 UTC m=+686.785657129" watchObservedRunningTime="2026-04-20 14:38:22.579213927 +0000 UTC m=+686.786206367"
Apr 20 14:38:22.806516 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.806492 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:22.827762 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.827740 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:22.898070 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.897991 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jft9\" (UniqueName: \"kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9\") pod \"927dfb41-e18c-483b-8260-8cbdc8b22e89\" (UID: \"927dfb41-e18c-483b-8260-8cbdc8b22e89\") "
Apr 20 14:38:22.898070 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.898062 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c465w\" (UniqueName: \"kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w\") pod \"323fe8ec-b078-45da-a522-a05f20dd6695\" (UID: \"323fe8ec-b078-45da-a522-a05f20dd6695\") "
Apr 20 14:38:22.900118 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.900088 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9" (OuterVolumeSpecName: "kube-api-access-7jft9") pod "927dfb41-e18c-483b-8260-8cbdc8b22e89" (UID: "927dfb41-e18c-483b-8260-8cbdc8b22e89"). InnerVolumeSpecName "kube-api-access-7jft9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:38:22.900118 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.900102 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w" (OuterVolumeSpecName: "kube-api-access-c465w") pod "323fe8ec-b078-45da-a522-a05f20dd6695" (UID: "323fe8ec-b078-45da-a522-a05f20dd6695"). InnerVolumeSpecName "kube-api-access-c465w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:38:22.999726 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.999688 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c465w\" (UniqueName: \"kubernetes.io/projected/323fe8ec-b078-45da-a522-a05f20dd6695-kube-api-access-c465w\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:38:22.999726 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:22.999720 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jft9\" (UniqueName: \"kubernetes.io/projected/927dfb41-e18c-483b-8260-8cbdc8b22e89-kube-api-access-7jft9\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:38:23.559216 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.559180 2572 generic.go:358] "Generic (PLEG): container finished" podID="323fe8ec-b078-45da-a522-a05f20dd6695" containerID="07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc" exitCode=0
Apr 20 14:38:23.559699 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.559230 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hvzzl"
Apr 20 14:38:23.559699 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.559263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" event={"ID":"323fe8ec-b078-45da-a522-a05f20dd6695","Type":"ContainerDied","Data":"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"}
Apr 20 14:38:23.559699 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.559306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hvzzl" event={"ID":"323fe8ec-b078-45da-a522-a05f20dd6695","Type":"ContainerDied","Data":"12a25de31de30eca9d7d4c135d6f28cfc794e6b08d885aa0f1ee916ccb2dfeb2"}
Apr 20 14:38:23.559699 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.559329 2572 scope.go:117] "RemoveContainer" containerID="07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"
Apr 20 14:38:23.560601 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.560581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cc55c779-2dzhk" event={"ID":"9b6a99b7-42a5-4dde-8285-ebeceef1122e","Type":"ContainerStarted","Data":"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"}
Apr 20 14:38:23.561757 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.561727 2572 generic.go:358] "Generic (PLEG): container finished" podID="927dfb41-e18c-483b-8260-8cbdc8b22e89" containerID="52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc" exitCode=0
Apr 20 14:38:23.561846 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.561771 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7575d44875-6sj6r"
Apr 20 14:38:23.561904 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.561773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7575d44875-6sj6r" event={"ID":"927dfb41-e18c-483b-8260-8cbdc8b22e89","Type":"ContainerDied","Data":"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"}
Apr 20 14:38:23.561904 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.561876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7575d44875-6sj6r" event={"ID":"927dfb41-e18c-483b-8260-8cbdc8b22e89","Type":"ContainerDied","Data":"261e337495b3bff1be889b3579c0cfc0d1d24e0690b2a41b84bdea0247d7b8dc"}
Apr 20 14:38:23.570453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.569028 2572 scope.go:117] "RemoveContainer" containerID="07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"
Apr 20 14:38:23.570453 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:38:23.569677 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc\": container with ID starting with 07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc not found: ID does not exist" containerID="07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"
Apr 20 14:38:23.570453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.569710 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc"} err="failed to get container status \"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc\": rpc error: code = NotFound desc = could not find container \"07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc\": container with ID starting with 07395b1b5506191199d49a63f295513398bd4fbf5de43bd286709b1606dec6dc not found: ID does not exist"
Apr 20 14:38:23.570453 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.569734 2572 scope.go:117] "RemoveContainer" containerID="52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"
Apr 20 14:38:23.578649 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.578634 2572 scope.go:117] "RemoveContainer" containerID="52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"
Apr 20 14:38:23.578881 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:38:23.578862 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc\": container with ID starting with 52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc not found: ID does not exist" containerID="52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"
Apr 20 14:38:23.578940 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.578887 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc"} err="failed to get container status \"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc\": rpc error: code = NotFound desc = could not find container \"52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc\": container with ID starting with 52c35d3a784144dde0e04f8db0b3ac6b26b12bb2bc63a3ce72851d6b408b0efc not found: ID does not exist"
Apr 20 14:38:23.580926 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.580887 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6cc55c779-2dzhk" podStartSLOduration=2.2710442730000002 podStartE2EDuration="2.580875659s" podCreationTimestamp="2026-04-20 14:38:21 +0000 UTC" firstStartedPulling="2026-04-20 14:38:22.321653969 +0000 UTC m=+686.528646388"
lastFinishedPulling="2026-04-20 14:38:22.631485138 +0000 UTC m=+686.838477774" observedRunningTime="2026-04-20 14:38:23.579533348 +0000 UTC m=+687.786525787" watchObservedRunningTime="2026-04-20 14:38:23.580875659 +0000 UTC m=+687.787868145" Apr 20 14:38:23.596154 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.596113 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"] Apr 20 14:38:23.599318 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.599291 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7575d44875-6sj6r"] Apr 20 14:38:23.603463 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.603434 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"] Apr 20 14:38:23.603659 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.603637 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-kb9bg" podUID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" containerName="authorino" containerID="cri-o://ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b" gracePeriod=30 Apr 20 14:38:23.617604 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.617579 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"] Apr 20 14:38:23.622597 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.622570 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hvzzl"] Apr 20 14:38:23.844204 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.844181 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kb9bg" Apr 20 14:38:23.907969 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.907939 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n44k8\" (UniqueName: \"kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8\") pod \"be6ae1da-7583-4c2d-85ca-17d8e898c0bf\" (UID: \"be6ae1da-7583-4c2d-85ca-17d8e898c0bf\") " Apr 20 14:38:23.910035 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:23.910011 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8" (OuterVolumeSpecName: "kube-api-access-n44k8") pod "be6ae1da-7583-4c2d-85ca-17d8e898c0bf" (UID: "be6ae1da-7583-4c2d-85ca-17d8e898c0bf"). InnerVolumeSpecName "kube-api-access-n44k8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:38:24.009146 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.009104 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n44k8\" (UniqueName: \"kubernetes.io/projected/be6ae1da-7583-4c2d-85ca-17d8e898c0bf-kube-api-access-n44k8\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\"" Apr 20 14:38:24.467697 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.467663 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323fe8ec-b078-45da-a522-a05f20dd6695" path="/var/lib/kubelet/pods/323fe8ec-b078-45da-a522-a05f20dd6695/volumes" Apr 20 14:38:24.467991 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.467979 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927dfb41-e18c-483b-8260-8cbdc8b22e89" path="/var/lib/kubelet/pods/927dfb41-e18c-483b-8260-8cbdc8b22e89/volumes" Apr 20 14:38:24.567404 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.567369 2572 generic.go:358] "Generic (PLEG): container finished" podID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" 
containerID="ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b" exitCode=0 Apr 20 14:38:24.567832 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.567424 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kb9bg" Apr 20 14:38:24.567832 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.567441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kb9bg" event={"ID":"be6ae1da-7583-4c2d-85ca-17d8e898c0bf","Type":"ContainerDied","Data":"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b"} Apr 20 14:38:24.567832 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.567477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kb9bg" event={"ID":"be6ae1da-7583-4c2d-85ca-17d8e898c0bf","Type":"ContainerDied","Data":"3b84f5fbaa91ecc49092fdfe041e116d3035cfb32afea2f884a19ce9a00d67b8"} Apr 20 14:38:24.567832 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.567495 2572 scope.go:117] "RemoveContainer" containerID="ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b" Apr 20 14:38:24.576875 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.576857 2572 scope.go:117] "RemoveContainer" containerID="ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b" Apr 20 14:38:24.577149 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:38:24.577116 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b\": container with ID starting with ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b not found: ID does not exist" containerID="ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b" Apr 20 14:38:24.577201 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.577159 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b"} err="failed to get container status \"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b\": rpc error: code = NotFound desc = could not find container \"ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b\": container with ID starting with ed9f3eeda7cdd8bf4eea7a51c938d7d5020375e7366a82c40c895f946b41701b not found: ID does not exist" Apr 20 14:38:24.582950 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.582926 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"] Apr 20 14:38:24.586351 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:24.586331 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-kb9bg"] Apr 20 14:38:26.467808 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:38:26.467773 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" path="/var/lib/kubelet/pods/be6ae1da-7583-4c2d-85ca-17d8e898c0bf/volumes" Apr 20 14:39:24.419204 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419176 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs"] Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419481 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="323fe8ec-b078-45da-a522-a05f20dd6695" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419491 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="323fe8ec-b078-45da-a522-a05f20dd6695" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419499 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="927dfb41-e18c-483b-8260-8cbdc8b22e89" containerName="authorino" Apr 20 14:39:24.419584 
ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419505 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="927dfb41-e18c-483b-8260-8cbdc8b22e89" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419514 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419519 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419562 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be6ae1da-7583-4c2d-85ca-17d8e898c0bf" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419574 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="927dfb41-e18c-483b-8260-8cbdc8b22e89" containerName="authorino" Apr 20 14:39:24.419584 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.419581 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="323fe8ec-b078-45da-a522-a05f20dd6695" containerName="authorino" Apr 20 14:39:24.427419 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.427399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.430857 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.430835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5p5fx\"" Apr 20 14:39:24.430986 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.430876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 14:39:24.430986 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.430897 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 14:39:24.430986 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.430837 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 14:39:24.433648 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.433626 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs"] Apr 20 14:39:24.579603 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.579757 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-kserve-provision-location\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.579757 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.579874 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed286892-c718-4998-a4ce-a17cf72e45c6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.579874 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.579987 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.579871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nc7\" (UniqueName: \"kubernetes.io/projected/ed286892-c718-4998-a4ce-a17cf72e45c6-kube-api-access-v9nc7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: 
\"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.680946 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.680881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.680946 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.680913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.680946 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.680948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.681216 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.680986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed286892-c718-4998-a4ce-a17cf72e45c6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 
14:39:24.681216 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.681035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.681216 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.681059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nc7\" (UniqueName: \"kubernetes.io/projected/ed286892-c718-4998-a4ce-a17cf72e45c6-kube-api-access-v9nc7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.681372 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.681303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.681372 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.681339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.681464 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.681408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.683192 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.683174 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed286892-c718-4998-a4ce-a17cf72e45c6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.683559 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.683539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed286892-c718-4998-a4ce-a17cf72e45c6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.689562 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.689541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nc7\" (UniqueName: \"kubernetes.io/projected/ed286892-c718-4998-a4ce-a17cf72e45c6-kube-api-access-v9nc7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs\" (UID: \"ed286892-c718-4998-a4ce-a17cf72e45c6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.738431 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.738412 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:24.861438 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.861414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs"] Apr 20 14:39:24.863550 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:39:24.863515 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded286892_c718_4998_a4ce_a17cf72e45c6.slice/crio-1aa2a7a36c4f5bfee23aedcc6f912be68c0f745ab9df211c09786d3bf5cd9aaf WatchSource:0}: Error finding container 1aa2a7a36c4f5bfee23aedcc6f912be68c0f745ab9df211c09786d3bf5cd9aaf: Status 404 returned error can't find the container with id 1aa2a7a36c4f5bfee23aedcc6f912be68c0f745ab9df211c09786d3bf5cd9aaf Apr 20 14:39:24.865327 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:24.865312 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:39:25.761738 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:25.761696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" event={"ID":"ed286892-c718-4998-a4ce-a17cf72e45c6","Type":"ContainerStarted","Data":"1aa2a7a36c4f5bfee23aedcc6f912be68c0f745ab9df211c09786d3bf5cd9aaf"} Apr 20 14:39:30.780566 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:30.780522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" event={"ID":"ed286892-c718-4998-a4ce-a17cf72e45c6","Type":"ContainerStarted","Data":"9958dce3de6bda8ba607011aebe8d6a7089a7124a43008c8bb22edc00535363b"} Apr 20 14:39:35.798042 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:35.798011 2572 generic.go:358] "Generic (PLEG): container finished" podID="ed286892-c718-4998-a4ce-a17cf72e45c6" 
containerID="9958dce3de6bda8ba607011aebe8d6a7089a7124a43008c8bb22edc00535363b" exitCode=0 Apr 20 14:39:35.798418 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:35.798082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" event={"ID":"ed286892-c718-4998-a4ce-a17cf72e45c6","Type":"ContainerDied","Data":"9958dce3de6bda8ba607011aebe8d6a7089a7124a43008c8bb22edc00535363b"} Apr 20 14:39:37.806463 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:37.806423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" event={"ID":"ed286892-c718-4998-a4ce-a17cf72e45c6","Type":"ContainerStarted","Data":"03260fedec470d4e26fabed8ca1882b02e4ca9ad2fd2b59e501d833dd4d4ba21"} Apr 20 14:39:37.806853 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:37.806667 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:39:37.825639 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:37.825591 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" podStartSLOduration=1.767299892 podStartE2EDuration="13.825579981s" podCreationTimestamp="2026-04-20 14:39:24 +0000 UTC" firstStartedPulling="2026-04-20 14:39:24.865436514 +0000 UTC m=+749.072428934" lastFinishedPulling="2026-04-20 14:39:36.923716606 +0000 UTC m=+761.130709023" observedRunningTime="2026-04-20 14:39:37.823219688 +0000 UTC m=+762.030212128" watchObservedRunningTime="2026-04-20 14:39:37.825579981 +0000 UTC m=+762.032572419" Apr 20 14:39:48.822792 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:39:48.822758 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs" Apr 20 14:40:23.773858 ip-10-0-140-30 
kubenswrapper[2572]: I0420 14:40:23.773777 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7b7b77666-x6l4l"] Apr 20 14:40:23.777376 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.777354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.785689 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.785659 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b7b77666-x6l4l"] Apr 20 14:40:23.847656 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.847632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7952c393-a501-4432-8968-966fdce3b2a6-tls-cert\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.847773 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.847681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kqd5\" (UniqueName: \"kubernetes.io/projected/7952c393-a501-4432-8968-966fdce3b2a6-kube-api-access-4kqd5\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.948525 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.948498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7952c393-a501-4432-8968-966fdce3b2a6-tls-cert\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.948631 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.948577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kqd5\" (UniqueName: 
\"kubernetes.io/projected/7952c393-a501-4432-8968-966fdce3b2a6-kube-api-access-4kqd5\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.950756 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.950737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7952c393-a501-4432-8968-966fdce3b2a6-tls-cert\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:23.955948 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:23.955929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kqd5\" (UniqueName: \"kubernetes.io/projected/7952c393-a501-4432-8968-966fdce3b2a6-kube-api-access-4kqd5\") pod \"authorino-7b7b77666-x6l4l\" (UID: \"7952c393-a501-4432-8968-966fdce3b2a6\") " pod="kuadrant-system/authorino-7b7b77666-x6l4l" Apr 20 14:40:24.086864 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:24.086809 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7b7b77666-x6l4l"
Apr 20 14:40:24.202586 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:24.202564 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b7b77666-x6l4l"]
Apr 20 14:40:24.206903 ip-10-0-140-30 kubenswrapper[2572]: W0420 14:40:24.206873 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7952c393_a501_4432_8968_966fdce3b2a6.slice/crio-47b4754941176f51e57bb162249982cc97535a66bfd988c1e60ebe6d0929502f WatchSource:0}: Error finding container 47b4754941176f51e57bb162249982cc97535a66bfd988c1e60ebe6d0929502f: Status 404 returned error can't find the container with id 47b4754941176f51e57bb162249982cc97535a66bfd988c1e60ebe6d0929502f
Apr 20 14:40:24.960312 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:24.960265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7b77666-x6l4l" event={"ID":"7952c393-a501-4432-8968-966fdce3b2a6","Type":"ContainerStarted","Data":"5305758e8d9a3e470a442c53a02797be8e2d150c8803d2a2c7fd9c11b40caf09"}
Apr 20 14:40:24.960312 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:24.960308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7b77666-x6l4l" event={"ID":"7952c393-a501-4432-8968-966fdce3b2a6","Type":"ContainerStarted","Data":"47b4754941176f51e57bb162249982cc97535a66bfd988c1e60ebe6d0929502f"}
Apr 20 14:40:24.974411 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:24.974364 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7b7b77666-x6l4l" podStartSLOduration=1.450180287 podStartE2EDuration="1.974348819s" podCreationTimestamp="2026-04-20 14:40:23 +0000 UTC" firstStartedPulling="2026-04-20 14:40:24.20863015 +0000 UTC m=+808.415622566" lastFinishedPulling="2026-04-20 14:40:24.73279867 +0000 UTC m=+808.939791098" observedRunningTime="2026-04-20 14:40:24.973704733 +0000 UTC m=+809.180697172" watchObservedRunningTime="2026-04-20 14:40:24.974348819 +0000 UTC m=+809.181341257"
Apr 20 14:40:25.009722 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.009642 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:40:25.009914 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.009889 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6cc55c779-2dzhk" podUID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" containerName="authorino" containerID="cri-o://060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea" gracePeriod=30
Apr 20 14:40:25.247117 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.247093 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:40:25.360283 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.360224 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzgz\" (UniqueName: \"kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz\") pod \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") "
Apr 20 14:40:25.360283 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.360271 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert\") pod \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\" (UID: \"9b6a99b7-42a5-4dde-8285-ebeceef1122e\") "
Apr 20 14:40:25.362207 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.362176 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz" (OuterVolumeSpecName: "kube-api-access-glzgz") pod "9b6a99b7-42a5-4dde-8285-ebeceef1122e" (UID: "9b6a99b7-42a5-4dde-8285-ebeceef1122e"). InnerVolumeSpecName "kube-api-access-glzgz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:40:25.369898 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.369872 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "9b6a99b7-42a5-4dde-8285-ebeceef1122e" (UID: "9b6a99b7-42a5-4dde-8285-ebeceef1122e"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:40:25.461298 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.461234 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b6a99b7-42a5-4dde-8285-ebeceef1122e-tls-cert\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:40:25.461298 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.461271 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glzgz\" (UniqueName: \"kubernetes.io/projected/9b6a99b7-42a5-4dde-8285-ebeceef1122e-kube-api-access-glzgz\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 14:40:25.964684 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.964654 2572 generic.go:358] "Generic (PLEG): container finished" podID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" containerID="060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea" exitCode=0
Apr 20 14:40:25.965033 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.964703 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cc55c779-2dzhk"
Apr 20 14:40:25.965033 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.964733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cc55c779-2dzhk" event={"ID":"9b6a99b7-42a5-4dde-8285-ebeceef1122e","Type":"ContainerDied","Data":"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"}
Apr 20 14:40:25.965033 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.964776 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cc55c779-2dzhk" event={"ID":"9b6a99b7-42a5-4dde-8285-ebeceef1122e","Type":"ContainerDied","Data":"48b88f76441598641aa1028d098d8a88749eb3760aa62acd3b59a174e21ad178"}
Apr 20 14:40:25.965033 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.964791 2572 scope.go:117] "RemoveContainer" containerID="060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"
Apr 20 14:40:25.972634 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.972612 2572 scope.go:117] "RemoveContainer" containerID="060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"
Apr 20 14:40:25.972870 ip-10-0-140-30 kubenswrapper[2572]: E0420 14:40:25.972852 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea\": container with ID starting with 060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea not found: ID does not exist" containerID="060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"
Apr 20 14:40:25.972923 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.972877 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea"} err="failed to get container status \"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea\": rpc error: code = NotFound desc = could not find container \"060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea\": container with ID starting with 060015acfcf9580f7f06dee401d8abc0d191638ef126066e926994ea19f967ea not found: ID does not exist"
Apr 20 14:40:25.986387 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.986368 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:40:25.994047 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:25.994025 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6cc55c779-2dzhk"]
Apr 20 14:40:26.468022 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:40:26.467996 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" path="/var/lib/kubelet/pods/9b6a99b7-42a5-4dde-8285-ebeceef1122e/volumes"
Apr 20 14:41:56.396353 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:41:56.396321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:41:56.397311 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:41:56.397286 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:46:56.417535 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:46:56.417501 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:46:56.418801 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:46:56.418779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:51:56.437637 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:51:56.437608 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:51:56.439527 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:51:56.439506 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:56:56.458905 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:56:56.458874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 14:56:56.460386 ip-10-0-140-30 kubenswrapper[2572]: I0420 14:56:56.460364 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 15:00:00.137687 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.137646 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:00:00.138209 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.137969 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" containerName="authorino"
Apr 20 15:00:00.138209 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.137982 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" containerName="authorino"
Apr 20 15:00:00.138209 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.138050 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b6a99b7-42a5-4dde-8285-ebeceef1122e" containerName="authorino"
Apr 20 15:00:00.140857 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.140836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:00:00.143324 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.143298 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5vm9x\""
Apr 20 15:00:00.157293 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.157253 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:00:00.270377 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.270337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktf64\" (UniqueName: \"kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64\") pod \"maas-api-key-cleanup-29611620-cxvsd\" (UID: \"aa20cdef-be4d-4406-a540-daac82e869e6\") " pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:00:00.371432 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.371387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktf64\" (UniqueName: \"kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64\") pod \"maas-api-key-cleanup-29611620-cxvsd\" (UID: \"aa20cdef-be4d-4406-a540-daac82e869e6\") " pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:00:00.380116 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.380085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktf64\" (UniqueName: \"kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64\") pod \"maas-api-key-cleanup-29611620-cxvsd\" (UID: \"aa20cdef-be4d-4406-a540-daac82e869e6\") " pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:00:00.451300 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.451209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:00:00.575820 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.575786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:00:00.578620 ip-10-0-140-30 kubenswrapper[2572]: W0420 15:00:00.578590 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa20cdef_be4d_4406_a540_daac82e869e6.slice/crio-316f74e1805599e81c3cd071614b0932894f9b3cddbe65acd1ff8b904899ca1a WatchSource:0}: Error finding container 316f74e1805599e81c3cd071614b0932894f9b3cddbe65acd1ff8b904899ca1a: Status 404 returned error can't find the container with id 316f74e1805599e81c3cd071614b0932894f9b3cddbe65acd1ff8b904899ca1a
Apr 20 15:00:00.580524 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.580510 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:00:00.756414 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:00.756380 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerStarted","Data":"316f74e1805599e81c3cd071614b0932894f9b3cddbe65acd1ff8b904899ca1a"}
Apr 20 15:00:05.778627 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:05.778583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerStarted","Data":"61a5353997f9c972f83ed9b09c6f67cf7b2b36a1cb0e9dcc7d87a91371e27c47"}
Apr 20 15:00:05.793930 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:05.793873 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" podStartSLOduration=1.474376543 podStartE2EDuration="5.793857545s" podCreationTimestamp="2026-04-20 15:00:00 +0000 UTC" firstStartedPulling="2026-04-20 15:00:00.58063653 +0000 UTC m=+1984.787628947" lastFinishedPulling="2026-04-20 15:00:04.900117523 +0000 UTC m=+1989.107109949" observedRunningTime="2026-04-20 15:00:05.792289604 +0000 UTC m=+1989.999282042" watchObservedRunningTime="2026-04-20 15:00:05.793857545 +0000 UTC m=+1990.000849984"
Apr 20 15:00:25.848252 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:25.848155 2572 generic.go:358] "Generic (PLEG): container finished" podID="aa20cdef-be4d-4406-a540-daac82e869e6" containerID="61a5353997f9c972f83ed9b09c6f67cf7b2b36a1cb0e9dcc7d87a91371e27c47" exitCode=6
Apr 20 15:00:25.848252 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:25.848212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerDied","Data":"61a5353997f9c972f83ed9b09c6f67cf7b2b36a1cb0e9dcc7d87a91371e27c47"}
Apr 20 15:00:25.848761 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:25.848582 2572 scope.go:117] "RemoveContainer" containerID="61a5353997f9c972f83ed9b09c6f67cf7b2b36a1cb0e9dcc7d87a91371e27c47"
Apr 20 15:00:26.853744 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:26.853707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerStarted","Data":"eeec7756c530e8e03aea2bb9aa0ad970ba1a4390ec1e958c6caf5c74b8b5c3e5"}
Apr 20 15:00:46.922969 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:46.922886 2572 generic.go:358] "Generic (PLEG): container finished" podID="aa20cdef-be4d-4406-a540-daac82e869e6" containerID="eeec7756c530e8e03aea2bb9aa0ad970ba1a4390ec1e958c6caf5c74b8b5c3e5" exitCode=6
Apr 20 15:00:46.922969 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:46.922960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerDied","Data":"eeec7756c530e8e03aea2bb9aa0ad970ba1a4390ec1e958c6caf5c74b8b5c3e5"}
Apr 20 15:00:46.923543 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:46.923002 2572 scope.go:117] "RemoveContainer" containerID="61a5353997f9c972f83ed9b09c6f67cf7b2b36a1cb0e9dcc7d87a91371e27c47"
Apr 20 15:00:46.923543 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:00:46.923376 2572 scope.go:117] "RemoveContainer" containerID="eeec7756c530e8e03aea2bb9aa0ad970ba1a4390ec1e958c6caf5c74b8b5c3e5"
Apr 20 15:00:46.923668 ip-10-0-140-30 kubenswrapper[2572]: E0420 15:00:46.923645 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611620-cxvsd_opendatahub(aa20cdef-be4d-4406-a540-daac82e869e6)\"" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" podUID="aa20cdef-be4d-4406-a540-daac82e869e6"
Apr 20 15:01:00.010223 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.010183 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:01:00.134903 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.134879 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:01:00.275337 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.275237 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktf64\" (UniqueName: \"kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64\") pod \"aa20cdef-be4d-4406-a540-daac82e869e6\" (UID: \"aa20cdef-be4d-4406-a540-daac82e869e6\") "
Apr 20 15:01:00.277508 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.277481 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64" (OuterVolumeSpecName: "kube-api-access-ktf64") pod "aa20cdef-be4d-4406-a540-daac82e869e6" (UID: "aa20cdef-be4d-4406-a540-daac82e869e6"). InnerVolumeSpecName "kube-api-access-ktf64". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:01:00.375851 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.375814 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktf64\" (UniqueName: \"kubernetes.io/projected/aa20cdef-be4d-4406-a540-daac82e869e6-kube-api-access-ktf64\") on node \"ip-10-0-140-30.ec2.internal\" DevicePath \"\""
Apr 20 15:01:00.971995 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.971954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd" event={"ID":"aa20cdef-be4d-4406-a540-daac82e869e6","Type":"ContainerDied","Data":"316f74e1805599e81c3cd071614b0932894f9b3cddbe65acd1ff8b904899ca1a"}
Apr 20 15:01:00.971995 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.971980 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611620-cxvsd"
Apr 20 15:01:00.972286 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.972010 2572 scope.go:117] "RemoveContainer" containerID="eeec7756c530e8e03aea2bb9aa0ad970ba1a4390ec1e958c6caf5c74b8b5c3e5"
Apr 20 15:01:00.987841 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.987813 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:01:00.991228 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:00.991205 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611620-cxvsd"]
Apr 20 15:01:02.466668 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:02.466638 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" path="/var/lib/kubelet/pods/aa20cdef-be4d-4406-a540-daac82e869e6/volumes"
Apr 20 15:01:56.480680 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:56.480651 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 15:01:56.483030 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:01:56.483003 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x29sq_a649d03f-abf5-42ca-849e-903f5fdc0299/ovn-acl-logging/0.log"
Apr 20 15:02:51.764767 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:51.764693 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b7b77666-x6l4l_7952c393-a501-4432-8968-966fdce3b2a6/authorino/0.log"
Apr 20 15:02:56.108872 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:56.108839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-bmvsb_e712a1b7-aae0-4453-b6d1-13e91b06477d/manager/0.log"
Apr 20 15:02:57.754047 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:57.754018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b7b77666-x6l4l_7952c393-a501-4432-8968-966fdce3b2a6/authorino/0.log"
Apr 20 15:02:57.882199 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:57.882173 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rrtcj_e66278c8-aaf6-44bd-b6be-098b50d2d90e/manager/0.log"
Apr 20 15:02:58.016385 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:58.016328 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-jx76k_60de93f7-8e60-4bae-870d-deb50faba7d5/manager/0.log"
Apr 20 15:02:58.297933 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:58.297853 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-jjmkz_f5cce871-f463-4954-8e13-a8ed9232b69a/registry-server/0.log"
Apr 20 15:02:58.641574 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:58.641502 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bjpj4_03712bcc-6d66-4b0e-8fc2-75f85bdf622b/manager/0.log"
Apr 20 15:02:59.008039 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:59.008014 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5_0e9100e6-cd2c-43b5-999b-a00fa7a5048c/istio-proxy/0.log"
Apr 20 15:02:59.498946 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:02:59.498915 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-ldmzg_9b585b4d-a336-46c5-b72f-94313026996d/istio-proxy/0.log"
Apr 20 15:03:00.296489 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:00.296451 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs_ed286892-c718-4998-a4ce-a17cf72e45c6/storage-initializer/0.log"
Apr 20 15:03:00.305302 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:00.305274 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccql8xs_ed286892-c718-4998-a4ce-a17cf72e45c6/main/0.log"
Apr 20 15:03:04.541064 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541031 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xvzk6/must-gather-n4kw4"]
Apr 20 15:03:04.541474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541352 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.541474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541363 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.541474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541373 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.541474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541379 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.541474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541433 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.541706 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.541525 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa20cdef-be4d-4406-a540-daac82e869e6" containerName="cleanup"
Apr 20 15:03:04.543476 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.543458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.545985 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.545958 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xvzk6\"/\"default-dockercfg-5f9zf\""
Apr 20 15:03:04.546101 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.545958 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"openshift-service-ca.crt\""
Apr 20 15:03:04.546101 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.545996 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"kube-root-ca.crt\""
Apr 20 15:03:04.560900 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.560880 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/must-gather-n4kw4"]
Apr 20 15:03:04.702396 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.702362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/020d9e79-7997-416f-b05f-6771646f53de-must-gather-output\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.702567 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.702411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdm7\" (UniqueName: \"kubernetes.io/projected/020d9e79-7997-416f-b05f-6771646f53de-kube-api-access-zfdm7\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.803680 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.803582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdm7\" (UniqueName: \"kubernetes.io/projected/020d9e79-7997-416f-b05f-6771646f53de-kube-api-access-zfdm7\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.803851 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.803712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/020d9e79-7997-416f-b05f-6771646f53de-must-gather-output\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.804053 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.804032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/020d9e79-7997-416f-b05f-6771646f53de-must-gather-output\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.811963 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.811940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdm7\" (UniqueName: \"kubernetes.io/projected/020d9e79-7997-416f-b05f-6771646f53de-kube-api-access-zfdm7\") pod \"must-gather-n4kw4\" (UID: \"020d9e79-7997-416f-b05f-6771646f53de\") " pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:04.852908 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:04.852881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/must-gather-n4kw4"
Apr 20 15:03:05.178351 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:05.178328 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/must-gather-n4kw4"]
Apr 20 15:03:05.181076 ip-10-0-140-30 kubenswrapper[2572]: W0420 15:03:05.181049 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020d9e79_7997_416f_b05f_6771646f53de.slice/crio-9f9e0cbdfe31fc9786ffac884db7c812576d5f5b3c3b39fdd810727f57b94748 WatchSource:0}: Error finding container 9f9e0cbdfe31fc9786ffac884db7c812576d5f5b3c3b39fdd810727f57b94748: Status 404 returned error can't find the container with id 9f9e0cbdfe31fc9786ffac884db7c812576d5f5b3c3b39fdd810727f57b94748
Apr 20 15:03:05.387656 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:05.387622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/must-gather-n4kw4" event={"ID":"020d9e79-7997-416f-b05f-6771646f53de","Type":"ContainerStarted","Data":"9f9e0cbdfe31fc9786ffac884db7c812576d5f5b3c3b39fdd810727f57b94748"}
Apr 20 15:03:07.400392 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:07.400350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/must-gather-n4kw4" event={"ID":"020d9e79-7997-416f-b05f-6771646f53de","Type":"ContainerStarted","Data":"8e9282c27a1871def42d6e7219eb87c9f44c8588efdbca5a9a61f163a9eec161"}
Apr 20 15:03:07.400859 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:07.400398 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/must-gather-n4kw4" event={"ID":"020d9e79-7997-416f-b05f-6771646f53de","Type":"ContainerStarted","Data":"6d3a146bdcce2355a1bdb3a0f5ab4a3f0efabb083ca4a3668a72f4b19518d256"}
Apr 20 15:03:07.417678 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:07.417629 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xvzk6/must-gather-n4kw4" podStartSLOduration=2.296156355 podStartE2EDuration="3.41761621s" podCreationTimestamp="2026-04-20 15:03:04 +0000 UTC" firstStartedPulling="2026-04-20 15:03:05.183139876 +0000 UTC m=+2169.390132295" lastFinishedPulling="2026-04-20 15:03:06.304599726 +0000 UTC m=+2170.511592150" observedRunningTime="2026-04-20 15:03:07.415194326 +0000 UTC m=+2171.622186776" watchObservedRunningTime="2026-04-20 15:03:07.41761621 +0000 UTC m=+2171.624608649"
Apr 20 15:03:07.935793 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:07.935759 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vkzx4_5fc6b1ee-788e-4792-8454-48a99a628442/global-pull-secret-syncer/0.log"
Apr 20 15:03:08.044676 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:08.044647 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p697p_35778b02-a4c0-4a27-907b-2c96d1273465/konnectivity-agent/0.log"
Apr 20 15:03:08.147825 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:08.147799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-30.ec2.internal_c8ce0f8ff8898a2f19fd2c9a4c4f3273/haproxy/0.log"
Apr 20 15:03:12.693066 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:12.692963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b7b77666-x6l4l_7952c393-a501-4432-8968-966fdce3b2a6/authorino/0.log"
Apr 20 15:03:12.729033 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:12.728997 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rrtcj_e66278c8-aaf6-44bd-b6be-098b50d2d90e/manager/0.log"
Apr 20 15:03:12.754010 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:12.753974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-jx76k_60de93f7-8e60-4bae-870d-deb50faba7d5/manager/0.log"
Apr 20 15:03:12.819424 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:12.819326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-jjmkz_f5cce871-f463-4954-8e13-a8ed9232b69a/registry-server/0.log"
Apr 20 15:03:12.974500 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:12.974459 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bjpj4_03712bcc-6d66-4b0e-8fc2-75f85bdf622b/manager/0.log"
Apr 20 15:03:14.691358 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:14.691324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-66f6c54c88-wz6n8_94161a15-8d5c-42b3-b76b-d8199963c145/metrics-server/0.log"
Apr 20 15:03:15.012731 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.012702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vp8nl_41c9c8fd-f140-48b8-95f6-02067151be5a/node-exporter/0.log"
Apr 20 15:03:15.036804 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.036751 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vp8nl_41c9c8fd-f140-48b8-95f6-02067151be5a/kube-rbac-proxy/0.log"
Apr 20 15:03:15.064602 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.064566 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vp8nl_41c9c8fd-f140-48b8-95f6-02067151be5a/init-textfile/0.log"
Apr 20 15:03:15.509184 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.509120 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/thanos-query/0.log"
Apr 20 15:03:15.533918 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.533887 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/kube-rbac-proxy-web/0.log"
Apr 20 15:03:15.556387 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.556358 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/kube-rbac-proxy/0.log"
Apr 20 15:03:15.579318 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.579288 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/prom-label-proxy/0.log"
Apr 20 15:03:15.602859 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.602831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/kube-rbac-proxy-rules/0.log"
Apr 20 15:03:15.626584 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:15.626557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8445558856-t2vgl_45644160-70a2-4887-b258-6c0a50fd530c/kube-rbac-proxy-metrics/0.log"
Apr 20 15:03:16.789504 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.789473 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"]
Apr 20 15:03:16.796048 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.796016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:16.803198 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.803169 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"]
Apr 20 15:03:16.921791 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.921753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-proc\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:16.921971 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.921844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-podres\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:16.921971 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.921908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrfv\" (UniqueName: \"kubernetes.io/projected/fc58aa3f-d9a1-460a-8504-82d76f59df27-kube-api-access-llrfv\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:16.921971 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.921946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-sys\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:16.922145 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:16.921970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-lib-modules\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:17.022914 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.022881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-sys\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:17.022914 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.022917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-lib-modules\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:17.023160 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.022962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-proc\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"
Apr 20 15:03:17.023160 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.022986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName:
\"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-podres\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.023160 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.023014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llrfv\" (UniqueName: \"kubernetes.io/projected/fc58aa3f-d9a1-460a-8504-82d76f59df27-kube-api-access-llrfv\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.023160 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.023014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-sys\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.023160 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.023084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-proc\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.023334 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.023180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-lib-modules\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.023334 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.023219 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc58aa3f-d9a1-460a-8504-82d76f59df27-podres\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.033332 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.033275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrfv\" (UniqueName: \"kubernetes.io/projected/fc58aa3f-d9a1-460a-8504-82d76f59df27-kube-api-access-llrfv\") pod \"perf-node-gather-daemonset-jmgvs\" (UID: \"fc58aa3f-d9a1-460a-8504-82d76f59df27\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.109960 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.109454 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.287766 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.286657 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs"] Apr 20 15:03:17.291620 ip-10-0-140-30 kubenswrapper[2572]: W0420 15:03:17.291571 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfc58aa3f_d9a1_460a_8504_82d76f59df27.slice/crio-d06537e45247eceff50d3a071bfd5b3c1477e6c27d1efdf278c506c3db7a3446 WatchSource:0}: Error finding container d06537e45247eceff50d3a071bfd5b3c1477e6c27d1efdf278c506c3db7a3446: Status 404 returned error can't find the container with id d06537e45247eceff50d3a071bfd5b3c1477e6c27d1efdf278c506c3db7a3446 Apr 20 15:03:17.444570 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.444539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" 
event={"ID":"fc58aa3f-d9a1-460a-8504-82d76f59df27","Type":"ContainerStarted","Data":"bc705e0b483e1d9511d791283cd1bbd4efe469334f901e0a7414a1eaa97c086f"} Apr 20 15:03:17.444570 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.444574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" event={"ID":"fc58aa3f-d9a1-460a-8504-82d76f59df27","Type":"ContainerStarted","Data":"d06537e45247eceff50d3a071bfd5b3c1477e6c27d1efdf278c506c3db7a3446"} Apr 20 15:03:17.444848 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.444615 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:17.462648 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:17.462587 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" podStartSLOduration=1.4625701119999999 podStartE2EDuration="1.462570112s" podCreationTimestamp="2026-04-20 15:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:17.460601049 +0000 UTC m=+2181.667593487" watchObservedRunningTime="2026-04-20 15:03:17.462570112 +0000 UTC m=+2181.669562668" Apr 20 15:03:19.039786 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:19.039749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6ln9g_ed9eaae0-abf3-4e1a-89f2-760f6e63f14a/dns/0.log" Apr 20 15:03:19.067413 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:19.067374 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6ln9g_ed9eaae0-abf3-4e1a-89f2-760f6e63f14a/kube-rbac-proxy/0.log" Apr 20 15:03:19.217920 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:19.217887 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-w77dh_26818813-da84-407b-b55f-77d9ffcbb474/dns-node-resolver/0.log" Apr 20 15:03:19.789340 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:19.789311 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f759k_db0116b2-344f-40ec-a0a2-b47e8cb06248/node-ca/0.log" Apr 20 15:03:20.759141 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:20.759084 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdmjb5_0e9100e6-cd2c-43b5-999b-a00fa7a5048c/istio-proxy/0.log" Apr 20 15:03:20.966619 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:20.966583 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-ldmzg_9b585b4d-a336-46c5-b72f-94313026996d/istio-proxy/0.log" Apr 20 15:03:21.581347 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:21.581322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b2q9g_2e1b2d6d-33b7-4a6a-97a6-47d7a6c938e4/serve-healthcheck-canary/0.log" Apr 20 15:03:22.091033 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:22.091001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jkbs_78b75644-9f8e-441d-9ce9-45bc435b888f/kube-rbac-proxy/0.log" Apr 20 15:03:22.160138 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:22.160108 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jkbs_78b75644-9f8e-441d-9ce9-45bc435b888f/exporter/0.log" Apr 20 15:03:22.197946 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:22.197920 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jkbs_78b75644-9f8e-441d-9ce9-45bc435b888f/extractor/0.log" Apr 20 15:03:23.458502 ip-10-0-140-30 kubenswrapper[2572]: I0420 
15:03:23.458474 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-jmgvs" Apr 20 15:03:24.584616 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:24.584567 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-bmvsb_e712a1b7-aae0-4453-b6d1-13e91b06477d/manager/0.log" Apr 20 15:03:26.101099 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:26.101073 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bc7d4767f-lmgxk_a7afbeac-dacc-4295-a2a3-d5072f22a1b1/manager/0.log" Apr 20 15:03:26.132002 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:26.131903 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-m8tbv_d59a3530-dca6-42a2-a64e-9c2a300e2fe9/openshift-lws-operator/0.log" Apr 20 15:03:31.980161 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:31.980119 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9gnvx_64d6d293-736d-4b7e-86ca-9b3decf1c068/kube-multus/0.log" Apr 20 15:03:32.007549 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.007520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/kube-multus-additional-cni-plugins/0.log" Apr 20 15:03:32.029190 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.029162 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/egress-router-binary-copy/0.log" Apr 20 15:03:32.050796 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.050767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/cni-plugins/0.log" Apr 20 
15:03:32.074578 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.074552 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/bond-cni-plugin/0.log" Apr 20 15:03:32.097255 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.097225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/routeoverride-cni/0.log" Apr 20 15:03:32.121919 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.121899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/whereabouts-cni-bincopy/0.log" Apr 20 15:03:32.144474 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.144447 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgc2_9798cc7a-2b24-4609-adaf-fd0cb6fa296b/whereabouts-cni/0.log" Apr 20 15:03:32.554430 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.554405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7tjl_19b3975f-609f-427a-a428-db9cb8176eec/network-metrics-daemon/0.log" Apr 20 15:03:32.578104 ip-10-0-140-30 kubenswrapper[2572]: I0420 15:03:32.578076 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7tjl_19b3975f-609f-427a-a428-db9cb8176eec/kube-rbac-proxy/0.log"