Apr 20 20:11:08.128336 ip-10-0-141-183 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:11:08.564710 ip-10-0-141-183 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:08.564710 ip-10-0-141-183 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:11:08.564710 ip-10-0-141-183 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:08.564710 ip-10-0-141-183 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:11:08.564710 ip-10-0-141-183 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:08.567147 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.567043    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571848    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571881    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571887    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571892    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571895    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571899    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571905    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571908    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:08.571894 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571912    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571916    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571920    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571923    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571927    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571930    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571933    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571937    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571940    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571944    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571964    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571968    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571971    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571975    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571981    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571985    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571989    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571993    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.571997    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:08.572396 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572002    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572006    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572009    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572013    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572017    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572021    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572025    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572029    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572033    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572037    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572041    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572044    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572048    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572052    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572056    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572061    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572065    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572069    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572075    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572078    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:08.573149 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572085    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572091    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572098    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572103    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572107    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572111    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572116    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572120    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572124    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572131    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572139    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572144    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572149    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572153    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572157    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572161    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572166    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572170    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572175    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:08.573893 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572179    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572183    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572189    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572193    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572198    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572202    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572208    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572213    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572217    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572222    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572226    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572231    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572235    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572239    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572244    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572249    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572253    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572257    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572261    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.572265    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573916    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:08.574593 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573935    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573942    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573967    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573972    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573976    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573981    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573986    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.573990    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574002    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574006    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574011    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574015    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574020    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574026    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574030    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574035    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574055    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574077    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574084    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574090    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:08.575155 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574096    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574100    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574105    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574123    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574152    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574156    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574175    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574328    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574337    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574340    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574343    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574346    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574349    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574351    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574354    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574357    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574359    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574362    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574365    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574367    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:08.575661 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574369    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574372    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574374    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574381    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574384    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574386    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574389    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574392    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574395    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574403    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574406    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574408    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574411    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574413    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574416    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574418    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574421    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574426    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574429    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574432    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:08.576220 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574434    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574437    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574440    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574443    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574445    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574448    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574450    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574454    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574458    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574461    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574464    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574467    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574471    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574475    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574478    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574481    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574484    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574486    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574489    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:08.576694 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574494    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574496    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574500    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574503    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574506    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.574509    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575219    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575231    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575238    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575243    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575248    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575251    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575256    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575261    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575264    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575267    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575270    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575274    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575277    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575281    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575284    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575287    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575290    2580 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:11:08.577174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575293    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575296    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575302    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575305    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575308    2580 flags.go:64] FLAG: --config-dir=""
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575311    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575314    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575319    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575322    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575325    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575328    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575332    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575335    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575338    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575341    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575344    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575349    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575352    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575355    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575358    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575361    2580 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575364    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575369    2580 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575373    2580 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575376    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:11:08.577729 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575379    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575382    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575386    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575388    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575391    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575394    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575397    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575400    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575403    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575406    2580 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575409 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575412 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575415 2580 flags.go:64] FLAG: --feature-gates="" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575419 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575422 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575425 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575429 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575432 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575435 2580 flags.go:64] FLAG: --help="false" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575438 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575441 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575445 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575447 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575454 2580 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:11:08.578352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575457 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575460 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575463 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575465 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575468 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575471 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575475 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575478 2580 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575481 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575484 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575487 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575489 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575492 2580 flags.go:64] FLAG: --lock-file="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575495 2580 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575498 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575501 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575506 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575509 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575512 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575515 2580 flags.go:64] FLAG: --logging-format="text" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575517 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575521 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575524 2580 flags.go:64] FLAG: --manifest-url="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575527 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575532 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575536 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:11:08.579003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575540 2580 flags.go:64] FLAG: --max-pods="110" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575543 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:11:08.575547 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575550 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575553 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575560 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575563 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575566 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575575 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575578 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575581 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575584 2580 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575587 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575592 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575595 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575599 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 20 
20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575602 2580 flags.go:64] FLAG: --port="10250" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575605 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575608 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ef699514cf9d287b" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575611 2580 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575614 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575617 2580 flags.go:64] FLAG: --register-node="true" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575619 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575622 2580 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:11:08.579633 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575626 2580 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575629 2580 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575632 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575635 2580 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575638 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575641 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575644 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 
20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575647 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575651 2580 flags.go:64] FLAG: --runonce="false" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575654 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575657 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575660 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575662 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575667 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575670 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575673 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575676 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575679 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575682 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575685 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575688 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:11:08.580323 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:11:08.575691 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575694 2580 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575697 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575703 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:11:08.580323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575706 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575709 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575714 2580 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575716 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575719 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575722 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575725 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575729 2580 flags.go:64] FLAG: --v="2" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575733 2580 flags.go:64] FLAG: --version="false" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575738 2580 flags.go:64] FLAG: --vmodule="" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575742 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 
20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.575746 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575843 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575847 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575850 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575854 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575857 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575859 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575862 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575864 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575869 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575871 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:11:08.580935 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575874 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575876 2580 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575879 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575882 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575884 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575887 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575890 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575892 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575895 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575898 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575900 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575903 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575905 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575908 2580 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575911 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575913 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575915 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575918 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575920 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:11:08.581496 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575924 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575928 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575931 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575935 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575938 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575941 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575944 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575962 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575966 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575968 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575971 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575975 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575978 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575981 2580 feature_gate.go:328] unrecognized 
feature gate: AutomatedEtcdBackup Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575984 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575987 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575989 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575992 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575995 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.575997 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:11:08.582055 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576000 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576003 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576005 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576008 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576010 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576013 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576016 2580 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576019 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576021 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576024 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576026 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576029 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576032 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576034 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576037 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576041 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576044 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576047 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576049 2580 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576052 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:11:08.582557 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576054 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576057 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576060 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576064 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576067 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576069 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576072 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576076 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576080 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576083 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576085 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576088 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576091 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576093 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576096 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576099 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:11:08.583097 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.576102 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:11:08.583496 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.576112 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:11:08.584199 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.584170 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 20:11:08.584240 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.584200 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 20:11:08.584271 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584256 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:11:08.584271 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584262 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:11:08.584271 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584265 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:11:08.584271 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584268 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:11:08.584271 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584271 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584275 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584278 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584281 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584285 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584287 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584290 2580 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584294 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584299 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584304 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584307 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584310 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584313 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584316 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584319 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584321 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584324 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584327 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584329 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 
20:11:08.584398 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584332 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584334 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584337 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584340 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584342 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584345 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584347 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584350 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584352 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584355 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584357 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584360 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:11:08.584851 ip-10-0-141-183 
kubenswrapper[2580]: W0420 20:11:08.584362 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584365 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584368 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584371 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584374 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584377 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584379 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584382 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:11:08.584851 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584384 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584387 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584389 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584392 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584394 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 
20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584397 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584400 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584402 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584405 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584407 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584410 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584412 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584415 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584418 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584420 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584423 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584425 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584428 2580 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584430 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584433 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:11:08.585351 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584436 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584438 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584441 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584443 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584446 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584448 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584452 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584455 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584457 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584460 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:11:08.585843 ip-10-0-141-183 
kubenswrapper[2580]: W0420 20:11:08.584463 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584466 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584468 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584471 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584473 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584476 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584485 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584487 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584490 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584493 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:11:08.585843 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584495 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584498 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584501 2580 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.584507 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584613 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584617 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584620 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584623 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584625 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584628 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584631 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584634 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584637 2580 feature_gate.go:328] unrecognized 
feature gate: EtcdBackendQuota Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584639 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584642 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584644 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:11:08.586380 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584647 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584649 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584651 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584654 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584657 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584660 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584662 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584665 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584667 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:11:08.586787 ip-10-0-141-183 
kubenswrapper[2580]: W0420 20:11:08.584670 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584672 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584675 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584678 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584681 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584684 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584687 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584691 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584693 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584696 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584699 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:11:08.586787 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584701 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584704 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584706 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584708 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584711 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584713 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584716 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584718 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584720 2580 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584723 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584725 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584729 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584731 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584733 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584736 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584739 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584742 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584745 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584747 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584750 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:11:08.587325 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584752 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:11:08.587832 
ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584755 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584758 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584760 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584763 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584765 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584769 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584771 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584774 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584776 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584779 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584781 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584784 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584786 2580 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584789 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584791 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584793 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584796 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584798 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584800 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:11:08.587832 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584803 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584805 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584807 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584812 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584815 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584818 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584821 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584825 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584828 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584831 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584834 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584836 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584839 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:08.584841 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.584847 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:11:08.588366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.585619 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 20:11:08.588738 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.588686 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 20:11:08.589777 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.589763 2580 server.go:1019] "Starting client certificate rotation" Apr 20 20:11:08.589889 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.589869 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:11:08.589925 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.589909 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:11:08.613502 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.613466 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:11:08.616345 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.616316 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:11:08.632930 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.632896 2580 log.go:25] "Validated CRI v1 runtime API" Apr 20 20:11:08.638846 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.638823 2580 log.go:25] "Validated CRI v1 image API" Apr 20 20:11:08.639557 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.639535 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 
20:11:08.640312 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.640295 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 20:11:08.645995 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.645966 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 863b15c6-43e0-4d73-b539-51577696c9a8:/dev/nvme0n1p4 e29ac1f9-9af4-4958-a336-a65633365862:/dev/nvme0n1p3] Apr 20 20:11:08.645995 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.645991 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 20:11:08.652569 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.652443 2580 manager.go:217] Machine: {Timestamp:2026-04-20 20:11:08.650641333 +0000 UTC m=+0.406011854 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3114627 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2490113180158628af181ddcef6383 SystemUUID:ec249011-3180-1586-28af-181ddcef6383 BootID:47d2e23e-c8b2-410c-a020-76b2a010a20b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 
Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bb:1a:92:d9:9d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bb:1a:92:d9:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:6c:a6:69:29:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 20:11:08.652569 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.652561 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 20:11:08.652707 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.652695 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 20:11:08.653804 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.653778 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 20:11:08.653981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.653805 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-183.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:11:08.654032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.653992 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:11:08.654032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.654001 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:11:08.654032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.654014 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:11:08.654698 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.654686 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:11:08.656088 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.656076 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:11:08.656400 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.656389 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:11:08.659496 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.659482 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:11:08.659560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.659506 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:11:08.659560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.659524 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:11:08.659560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.659535 2580 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:11:08.659560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.659547 2580 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 20:11:08.660431 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.660414 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lhjrr" Apr 20 20:11:08.660792 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.660776 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:08.660848 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.660809 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:08.664052 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.664035 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:11:08.665842 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.665828 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:11:08.666103 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.666089 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lhjrr" Apr 20 20:11:08.667465 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667453 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667473 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667482 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667490 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:11:08.667518 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667499 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667506 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667513 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:11:08.667518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667518 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:11:08.667690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667525 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:11:08.667690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667531 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:11:08.667690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667540 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:11:08.667690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.667549 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:11:08.669650 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.669639 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:11:08.669690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.669652 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:11:08.673547 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.673534 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:11:08.673597 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.673574 2580 server.go:1295] "Started kubelet" Apr 20 20:11:08.673807 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.673762 2580 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 20 20:11:08.673862 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.673772 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:11:08.673909 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.673862 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:11:08.675786 ip-10-0-141-183 systemd[1]: Started Kubernetes Kubelet. Apr 20 20:11:08.677212 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.676991 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:11:08.677415 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.677393 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:08.677788 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.677767 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:11:08.683865 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.683843 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:08.685113 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.685088 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:11:08.685211 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.685096 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:11:08.686243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.685894 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:11:08.686243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.685912 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:11:08.686243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686023 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:11:08.686399 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686259 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:11:08.686399 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686267 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:11:08.686399 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686360 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:11:08.686399 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686376 2580 factory.go:55] Registering systemd factory Apr 20 20:11:08.686399 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686384 2580 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:11:08.686533 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.686383 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-183.ec2.internal\" not found" Apr 20 20:11:08.686870 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686845 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-183.ec2.internal" not found Apr 20 20:11:08.686984 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.686856 2580 factory.go:153] Registering CRI-O factory Apr 20 20:11:08.687088 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.687070 2580 factory.go:223] Registration of the crio container factory successfully Apr 20 20:11:08.687170 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.687107 2580 factory.go:103] Registering Raw factory Apr 20 20:11:08.687170 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.687122 2580 manager.go:1196] Started watching for new ooms in manager Apr 20 20:11:08.687616 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.687598 2580 manager.go:319] Starting recovery of all containers Apr 20 20:11:08.687717 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.687706 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:08.689154 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.689111 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 20:11:08.691976 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.691935 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-183.ec2.internal\" not found" node="ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.693362 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.693329 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:11:08.697791 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.697772 2580 manager.go:324] Recovery completed Apr 20 20:11:08.702143 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.702125 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:08.702591 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.702575 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-183.ec2.internal" not found Apr 20 20:11:08.705714 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.705698 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:08.705785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.705729 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:08.705785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.705744 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" 
event="NodeHasSufficientPID" Apr 20 20:11:08.706307 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.706292 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:11:08.706307 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.706303 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:11:08.706434 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.706321 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:11:08.708462 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.708450 2580 policy_none.go:49] "None policy: Start" Apr 20 20:11:08.708514 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.708467 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:11:08.708514 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.708477 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:11:08.747751 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.747728 2580 manager.go:341] "Starting Device Plugin manager" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.747850 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.747865 2580 server.go:85] "Starting device plugin registration server" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.748186 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.748198 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.748298 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.748388 2580 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.748398 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.749001 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:11:08.756046 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.749051 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-183.ec2.internal\" not found" Apr 20 20:11:08.763308 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.763281 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-183.ec2.internal" not found Apr 20 20:11:08.819744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.819655 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:11:08.819744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.819694 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:11:08.819744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.819718 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:11:08.819744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.819725 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:11:08.819991 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:08.819766 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:11:08.823176 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.823148 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:08.849318 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.849289 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:08.850527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.850506 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:08.850627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.850545 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:08.850627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.850560 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:08.850627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.850584 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.859696 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.859663 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.920390 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.920348 2580 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal"] Apr 20 20:11:08.922813 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.922790 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.922898 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.922793 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.948913 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.948882 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.953576 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.953558 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.961150 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.961123 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:11:08.969753 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.969722 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:11:08.987521 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.987485 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.987521 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.987520 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a299fcf64ba455a08a0abdcaf7bcdd-config\") pod \"kube-apiserver-proxy-ip-10-0-141-183.ec2.internal\" (UID: \"01a299fcf64ba455a08a0abdcaf7bcdd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:08.987715 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:08.987542 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088473 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088473 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088410 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a299fcf64ba455a08a0abdcaf7bcdd-config\") pod \"kube-apiserver-proxy-ip-10-0-141-183.ec2.internal\" (UID: \"01a299fcf64ba455a08a0abdcaf7bcdd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088473 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088428 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088684 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088684 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1816a504f93ff56ac217cacae748395-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal\" (UID: \"e1816a504f93ff56ac217cacae748395\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.088684 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.088491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a299fcf64ba455a08a0abdcaf7bcdd-config\") pod \"kube-apiserver-proxy-ip-10-0-141-183.ec2.internal\" (UID: \"01a299fcf64ba455a08a0abdcaf7bcdd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.264230 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.264187 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.272842 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.272812 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" Apr 20 20:11:09.589502 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.589476 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 20:11:09.590352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.589693 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:09.590352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.589700 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:09.590352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.589699 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:09.660124 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.660083 2580 apiserver.go:52] "Watching apiserver" Apr 20 20:11:09.667585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.667533 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:06:08 +0000 
UTC" deadline="2027-09-16 20:47:09.921091971 +0000 UTC" Apr 20 20:11:09.667585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.667577 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12336h36m0.253519171s" Apr 20 20:11:09.668814 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.668797 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:11:09.670904 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.669146 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt","openshift-dns/node-resolver-d96qr","openshift-image-registry/node-ca-qgfkv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-k9hml","openshift-cluster-node-tuning-operator/tuned-qf9zb","openshift-multus/multus-additional-cni-plugins-mql6h","openshift-multus/multus-jfpm5","openshift-multus/network-metrics-daemon-z9tzr","openshift-network-diagnostics/network-check-target-hftxf","openshift-network-operator/iptables-alerter-hs9bp","kube-system/konnectivity-agent-vrxxm","kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal"] Apr 20 20:11:09.671542 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.671519 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.672724 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.672701 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.673687 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.673670 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.674282 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.674260 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.674372 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.674302 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:11:09.674372 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.674360 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.674703 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.674682 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xg57g\"" Apr 20 20:11:09.674859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.674844 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.675967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.675785 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.675967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.675796 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.675967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.675810 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2xsp\"" Apr 20 20:11:09.675967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.675937 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x9bmk\"" Apr 20 20:11:09.676192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.675920 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:11:09.676192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.676040 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.676585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.676568 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.676636 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.676617 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.677160 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.677143 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.677220 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.677157 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:11:09.677220 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.677195 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:11:09.677563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.677534 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.677657 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.677572 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.678304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.678290 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.679184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679167 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:11:09.679268 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679189 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2gbkm\"" Apr 20 20:11:09.679268 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679172 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:11:09.679388 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679293 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.679444 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679410 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.679495 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679469 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qr2n8\"" Apr 20 20:11:09.679607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.679591 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:09.679673 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.679656 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:09.680193 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680173 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:11:09.680277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680174 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:11:09.680277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680265 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-c9c95\"" Apr 20 20:11:09.680277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680271 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:11:09.680418 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680265 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.680418 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680340 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.680628 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680613 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:11:09.680773 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680759 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n495z\"" Apr 20 20:11:09.681002 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.680986 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:09.681068 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.681035 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:09.682051 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.682034 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.683337 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.683319 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.684444 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.684426 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 20:11:09.684538 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.684456 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:11:09.684604 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.684431 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 20:11:09.684650 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.684620 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sn7f8\"" Apr 20 20:11:09.688366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.685214 2580 certificate_manager.go:566] 
"Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 20:11:09.688366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.686029 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:11:09.688366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.686250 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sbtll\"" Apr 20 20:11:09.688366 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.686355 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:11:09.689063 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.688366 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 20:11:09.691506 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-serviceca\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.691608 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-kubernetes\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.691608 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-cnibin\") pod 
\"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.691608 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-cnibin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.691608 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-bin\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5zc\" (UniqueName: \"kubernetes.io/projected/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kube-api-access-2x5zc\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/803590cd-665f-48a2-83de-4637da6a6b00-tmp-dir\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691645 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q4sbj\" (UniqueName: \"kubernetes.io/projected/4682a90f-335e-4cef-bd3a-448c0f2a267f-kube-api-access-q4sbj\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-bin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vz48\" (UniqueName: \"kubernetes.io/projected/a3e3e1bd-aa60-4683-a87b-474021dc8a77-kube-api-access-7vz48\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-var-lib-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691748 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-device-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.691787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691783 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-lib-modules\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-etc-kubernetes\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691813 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnct6\" (UniqueName: \"kubernetes.io/projected/ba687eca-e6d2-4355-91df-eb1ca17741fe-kube-api-access-rnct6\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691845 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-systemd-units\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-log-socket\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b242afd-b68e-4166-9b12-dcd12f17eca0-agent-certs\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691928 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-sys\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-hostroot\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.691997 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692022 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b242afd-b68e-4166-9b12-dcd12f17eca0-konnectivity-ca\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692050 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-sys-fs\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psc5t\" (UniqueName: \"kubernetes.io/projected/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-kube-api-access-psc5t\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.692128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-netns\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692128 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:11:09.692116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-netns\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692163 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-registration-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-systemd\") 
pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692223 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-var-lib-kubelet\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692256 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvnc\" (UniqueName: \"kubernetes.io/projected/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-kube-api-access-6gvnc\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692278 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-conf\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692291 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-system-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-kubelet\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692339 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-systemd\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-node-log\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692383 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-netd\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:11:09.692419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/803590cd-665f-48a2-83de-4637da6a6b00-hosts-file\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-kubelet\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.692563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692464 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-config\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692484 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692503 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-binary-copy\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692517 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-k8s-cni-cncf-io\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692532 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-etc-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692547 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksq2v\" (UniqueName: \"kubernetes.io/projected/ffa5dc35-4231-402c-85a6-9ae4ea55a914-kube-api-access-ksq2v\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-cni-binary-copy\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-socket-dir-parent\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-multus\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3e3e1bd-aa60-4683-a87b-474021dc8a77-host-slash\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysconfig\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692630 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-multus-certs\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-script-lib\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692684 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-modprobe-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.693125 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3e3e1bd-aa60-4683-a87b-474021dc8a77-iptables-alerter-script\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " 
pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692711 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-env-overrides\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbdrv\" (UniqueName: \"kubernetes.io/projected/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-kube-api-access-mbdrv\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692758 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-system-cni-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692780 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-os-release\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692796 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692822 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-conf-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-daemon-config\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-slash\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692871 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-ovn\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:11:09.692886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-host\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692928 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-host\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.692986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693002 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-os-release\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 
20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:09.693624 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-socket-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.694086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693079 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrw7\" (UniqueName: \"kubernetes.io/projected/803590cd-665f-48a2-83de-4637da6a6b00-kube-api-access-pnrw7\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.694086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693118 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-run\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.694086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693144 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-tuned\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.694086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693168 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-tmp\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.694086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.693196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.699910 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.699886 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:11:09.720191 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.720148 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fwbn9" Apr 20 20:11:09.732035 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.732008 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fwbn9" Apr 20 20:11:09.793503 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-socket-dir\") pod 
\"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.793503 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnrw7\" (UniqueName: \"kubernetes.io/projected/803590cd-665f-48a2-83de-4637da6a6b00-kube-api-access-pnrw7\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-run\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-tuned\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-tmp\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-serviceca\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-kubernetes\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793729 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-run\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.793742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-cnibin\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-cnibin\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793788 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-kubernetes\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-cnibin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-bin\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793849 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5zc\" (UniqueName: 
\"kubernetes.io/projected/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kube-api-access-2x5zc\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/803590cd-665f-48a2-83de-4637da6a6b00-tmp-dir\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793888 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-cnibin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793898 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4sbj\" (UniqueName: \"kubernetes.io/projected/4682a90f-335e-4cef-bd3a-448c0f2a267f-kube-api-access-q4sbj\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-bin\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793925 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-bin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vz48\" (UniqueName: \"kubernetes.io/projected/a3e3e1bd-aa60-4683-a87b-474021dc8a77-kube-api-access-7vz48\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.793995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-var-lib-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-bin\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794046 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-device-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-var-lib-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794184 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-lib-modules\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794130 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-serviceca\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794134 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-etc-kubernetes\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794182 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnct6\" (UniqueName: \"kubernetes.io/projected/ba687eca-e6d2-4355-91df-eb1ca17741fe-kube-api-access-rnct6\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-device-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794217 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-systemd-units\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794216 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-lib-modules\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/803590cd-665f-48a2-83de-4637da6a6b00-tmp-dir\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794247 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-log-socket\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794251 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-etc-kubernetes\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794259 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-systemd-units\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794195 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794303 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-log-socket\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794294 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b242afd-b68e-4166-9b12-dcd12f17eca0-agent-certs\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-sys\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794365 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-hostroot\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b242afd-b68e-4166-9b12-dcd12f17eca0-konnectivity-ca\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.794981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794424 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-sys\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794440 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-hostroot\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-sys-fs\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794485 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-sys-fs\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psc5t\" (UniqueName: \"kubernetes.io/projected/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-kube-api-access-psc5t\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-netns\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-netns\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-netns\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-registration-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794625 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-run-netns\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-systemd\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794668 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-var-lib-kubelet\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794695 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvnc\" (UniqueName: \"kubernetes.io/projected/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-kube-api-access-6gvnc\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794701 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794701 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-registration-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.795809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-conf\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794756 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-system-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-var-lib-kubelet\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.794778 2580 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-system-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-systemd\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.794875 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:10.294842471 +0000 UTC m=+2.050212983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-kubelet\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794915 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-conf\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-systemd\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794941 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-socket-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-systemd\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.794982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-node-log\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-kubelet\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795026 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-netd\") pod 
\"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-node-log\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795053 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.796607 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-cni-netd\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795092 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b242afd-b68e-4166-9b12-dcd12f17eca0-konnectivity-ca\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/803590cd-665f-48a2-83de-4637da6a6b00-hosts-file\") pod 
\"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795154 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-kubelet\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795187 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795206 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-config\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795216 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/803590cd-665f-48a2-83de-4637da6a6b00-hosts-file\") 
pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795230 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-binary-copy\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795260 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-k8s-cni-cncf-io\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795321 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-etc-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-kubelet\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795374 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksq2v\" (UniqueName: \"kubernetes.io/projected/ffa5dc35-4231-402c-85a6-9ae4ea55a914-kube-api-access-ksq2v\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795398 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-cni-binary-copy\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.797490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795421 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-socket-dir-parent\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-multus\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3e3e1bd-aa60-4683-a87b-474021dc8a77-host-slash\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795508 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysconfig\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795628 2580 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-multus-certs\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-script-lib\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795811 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-modprobe-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3e3e1bd-aa60-4683-a87b-474021dc8a77-iptables-alerter-script\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795839 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-binary-copy\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795856 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-env-overrides\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbdrv\" (UniqueName: \"kubernetes.io/projected/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-kube-api-access-mbdrv\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795910 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-config\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-system-cni-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795978 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-etc-openvswitch\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798263 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-os-release\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.798263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-conf-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-daemon-config\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-slash\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-ovn\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-host\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-host\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " 
pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-socket-dir-parent\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-os-release\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-cni-binary-copy\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796668 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysconfig\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-var-lib-cni-multus\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796768 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-sysctl-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.796814 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-multus-certs\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovnkube-script-lib\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797425 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-host\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:09.798849 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-conf-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797462 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-host\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-modprobe-d\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.797923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-daemon-config\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-host-slash\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798048 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-run-ovn\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-multus-cni-dir\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798132 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-etc-tuned\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798147 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-system-cni-dir\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3e3e1bd-aa60-4683-a87b-474021dc8a77-host-slash\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3e3e1bd-aa60-4683-a87b-474021dc8a77-iptables-alerter-script\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5dc35-4231-402c-85a6-9ae4ea55a914-tmp\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.795987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-host-run-k8s-cni-cncf-io\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798239 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba687eca-e6d2-4355-91df-eb1ca17741fe-os-release\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4682a90f-335e-4cef-bd3a-448c0f2a267f-os-release\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798360 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798494 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-env-overrides\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:09.799350 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b242afd-b68e-4166-9b12-dcd12f17eca0-agent-certs\") pod \"konnectivity-agent-vrxxm\" (UID: \"8b242afd-b68e-4166-9b12-dcd12f17eca0\") " pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:09.799830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.798534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4682a90f-335e-4cef-bd3a-448c0f2a267f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:09.803353 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.803316 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:09.803353 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.803340 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:09.803353 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.803352 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:09.803574 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:09.803443 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:10.303424148 +0000 UTC m=+2.058794670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:09.804485 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.804411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnrw7\" (UniqueName: \"kubernetes.io/projected/803590cd-665f-48a2-83de-4637da6a6b00-kube-api-access-pnrw7\") pod \"node-resolver-d96qr\" (UID: \"803590cd-665f-48a2-83de-4637da6a6b00\") " pod="openshift-dns/node-resolver-d96qr"
Apr 20 20:11:09.805899 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.805870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vz48\" (UniqueName: \"kubernetes.io/projected/a3e3e1bd-aa60-4683-a87b-474021dc8a77-kube-api-access-7vz48\") pod \"iptables-alerter-hs9bp\" (UID: \"a3e3e1bd-aa60-4683-a87b-474021dc8a77\") " pod="openshift-network-operator/iptables-alerter-hs9bp"
Apr 20 20:11:09.806330 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.806293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5zc\" (UniqueName: \"kubernetes.io/projected/815d5998-2dfe-4a37-8d71-80fd71ad5a3f-kube-api-access-2x5zc\") pod \"aws-ebs-csi-driver-node-qqsvt\" (UID: \"815d5998-2dfe-4a37-8d71-80fd71ad5a3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt"
Apr 20 20:11:09.806708 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.806678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksq2v\" (UniqueName: \"kubernetes.io/projected/ffa5dc35-4231-402c-85a6-9ae4ea55a914-kube-api-access-ksq2v\") pod \"tuned-qf9zb\" (UID: \"ffa5dc35-4231-402c-85a6-9ae4ea55a914\") " pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:09.807114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.807094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psc5t\" (UniqueName: \"kubernetes.io/projected/53da44a9-d18c-47a4-b3e2-f5b196e47cbe-kube-api-access-psc5t\") pod \"node-ca-qgfkv\" (UID: \"53da44a9-d18c-47a4-b3e2-f5b196e47cbe\") " pod="openshift-image-registry/node-ca-qgfkv"
Apr 20 20:11:09.807216 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.807150 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnct6\" (UniqueName: \"kubernetes.io/projected/ba687eca-e6d2-4355-91df-eb1ca17741fe-kube-api-access-rnct6\") pod \"multus-jfpm5\" (UID: \"ba687eca-e6d2-4355-91df-eb1ca17741fe\") " pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:09.807216 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.807189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4sbj\" (UniqueName: \"kubernetes.io/projected/4682a90f-335e-4cef-bd3a-448c0f2a267f-kube-api-access-q4sbj\") pod \"multus-additional-cni-plugins-mql6h\" (UID: \"4682a90f-335e-4cef-bd3a-448c0f2a267f\") " pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:09.807355 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.807336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvnc\" (UniqueName: \"kubernetes.io/projected/2bc1e339-b5d7-4ff2-81cc-110408fe4e5f-kube-api-access-6gvnc\") pod \"ovnkube-node-k9hml\" (UID: \"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:09.809484 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.809466 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbdrv\" (UniqueName: \"kubernetes.io/projected/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-kube-api-access-mbdrv\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:09.838412 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.838372 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hs9bp"
Apr 20 20:11:09.843837 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.843819 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:09.988058 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:09.988018 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1816a504f93ff56ac217cacae748395.slice/crio-e3ec82a25b2b8044dba5f68398605ce6e52c5c238194654ace51035fdc3ac1fb WatchSource:0}: Error finding container e3ec82a25b2b8044dba5f68398605ce6e52c5c238194654ace51035fdc3ac1fb: Status 404 returned error can't find the container with id e3ec82a25b2b8044dba5f68398605ce6e52c5c238194654ace51035fdc3ac1fb
Apr 20 20:11:09.988256 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:09.988236 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a299fcf64ba455a08a0abdcaf7bcdd.slice/crio-934682632f0a51d311f3ba6be60973ec2b755c3e5269ed14d1a25766509780c5 WatchSource:0}: Error finding container 934682632f0a51d311f3ba6be60973ec2b755c3e5269ed14d1a25766509780c5: Status 404 returned error can't find the container with id 934682632f0a51d311f3ba6be60973ec2b755c3e5269ed14d1a25766509780c5
Apr 20 20:11:09.993450 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:09.993432 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:11:10.002421 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.002394 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt"
Apr 20 20:11:10.007261 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.007233 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d96qr"
Apr 20 20:11:10.009204 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.009176 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815d5998_2dfe_4a37_8d71_80fd71ad5a3f.slice/crio-d5c01c3cda2a025e801fba98e6a3586d7f6816b679ce1a3bc8bff712aac14c29 WatchSource:0}: Error finding container d5c01c3cda2a025e801fba98e6a3586d7f6816b679ce1a3bc8bff712aac14c29: Status 404 returned error can't find the container with id d5c01c3cda2a025e801fba98e6a3586d7f6816b679ce1a3bc8bff712aac14c29
Apr 20 20:11:10.014293 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.014266 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803590cd_665f_48a2_83de_4637da6a6b00.slice/crio-5304195ef56c44da1afcb454eeb1c942a724c2f6c3abbc6377d43e9ce27dc235 WatchSource:0}: Error finding container 5304195ef56c44da1afcb454eeb1c942a724c2f6c3abbc6377d43e9ce27dc235: Status 404 returned error can't find the container with id 5304195ef56c44da1afcb454eeb1c942a724c2f6c3abbc6377d43e9ce27dc235
Apr 20 20:11:10.021614 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.021589 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qgfkv"
Apr 20 20:11:10.027989 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.027944 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53da44a9_d18c_47a4_b3e2_f5b196e47cbe.slice/crio-81efcee3ba5af74a704e92a7a8b3c3d5b1d77eeeaf1f7894291fd25f82df36c2 WatchSource:0}: Error finding container 81efcee3ba5af74a704e92a7a8b3c3d5b1d77eeeaf1f7894291fd25f82df36c2: Status 404 returned error can't find the container with id 81efcee3ba5af74a704e92a7a8b3c3d5b1d77eeeaf1f7894291fd25f82df36c2
Apr 20 20:11:10.045412 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.045380 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:10.052461 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.052279 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc1e339_b5d7_4ff2_81cc_110408fe4e5f.slice/crio-e6aa920f1affa878df264e510bfa842d5a7891326514ea8fdd6a9086a10cd222 WatchSource:0}: Error finding container e6aa920f1affa878df264e510bfa842d5a7891326514ea8fdd6a9086a10cd222: Status 404 returned error can't find the container with id e6aa920f1affa878df264e510bfa842d5a7891326514ea8fdd6a9086a10cd222
Apr 20 20:11:10.053563 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.053549 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qf9zb"
Apr 20 20:11:10.060285 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.060254 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa5dc35_4231_402c_85a6_9ae4ea55a914.slice/crio-983a91d81f38e82b1be309d8a0aec0ca0d5560747667634cc62fa815da8daed2 WatchSource:0}: Error finding container 983a91d81f38e82b1be309d8a0aec0ca0d5560747667634cc62fa815da8daed2: Status 404 returned error can't find the container with id 983a91d81f38e82b1be309d8a0aec0ca0d5560747667634cc62fa815da8daed2
Apr 20 20:11:10.082662 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.082621 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mql6h"
Apr 20 20:11:10.089140 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.089109 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4682a90f_335e_4cef_bd3a_448c0f2a267f.slice/crio-e7b753950a36b9c40545ac55aaec0fc5a19a4b5b17adcc7c4a1b07deffe21716 WatchSource:0}: Error finding container e7b753950a36b9c40545ac55aaec0fc5a19a4b5b17adcc7c4a1b07deffe21716: Status 404 returned error can't find the container with id e7b753950a36b9c40545ac55aaec0fc5a19a4b5b17adcc7c4a1b07deffe21716
Apr 20 20:11:10.106921 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.106887 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jfpm5"
Apr 20 20:11:10.113558 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.113520 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba687eca_e6d2_4355_91df_eb1ca17741fe.slice/crio-65aaeb432fe22b1b5a2a763ddcfe53a6f1919006f275715acf4475a294a84a9c WatchSource:0}: Error finding container 65aaeb432fe22b1b5a2a763ddcfe53a6f1919006f275715acf4475a294a84a9c: Status 404 returned error can't find the container with id 65aaeb432fe22b1b5a2a763ddcfe53a6f1919006f275715acf4475a294a84a9c
Apr 20 20:11:10.192933 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.192904 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e3e1bd_aa60_4683_a87b_474021dc8a77.slice/crio-5554212d42ee47a5fe05032995824a05be30c82ea01d8848e3db5d2d2b4a3dd4 WatchSource:0}: Error finding container 5554212d42ee47a5fe05032995824a05be30c82ea01d8848e3db5d2d2b4a3dd4: Status 404 returned error can't find the container with id 5554212d42ee47a5fe05032995824a05be30c82ea01d8848e3db5d2d2b4a3dd4
Apr 20 20:11:10.193340 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:10.193318 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b242afd_b68e_4166_9b12_dcd12f17eca0.slice/crio-aa14648cc7bf928f9858bceed82e21a9514fc9ffb4a7e1ea1d74c2c4805f4052 WatchSource:0}: Error finding container aa14648cc7bf928f9858bceed82e21a9514fc9ffb4a7e1ea1d74c2c4805f4052: Status 404 returned error can't find the container with id aa14648cc7bf928f9858bceed82e21a9514fc9ffb4a7e1ea1d74c2c4805f4052
Apr 20 20:11:10.299699 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.299658 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:10.299979 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.299812 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:10.299979 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.299878 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:11.299862409 +0000 UTC m=+3.055232917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:10.400204 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.400110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:10.400376 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.400257 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:10.400376 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.400273 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:10.400376 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.400287 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:10.400376 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:10.400352 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:11.400334048 +0000 UTC m=+3.155704557 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:10.734544 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.734393 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:09 +0000 UTC" deadline="2028-02-04 00:22:15.97541558 +0000 UTC"
Apr 20 20:11:10.734544 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.734435 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15700h11m5.240983466s"
Apr 20 20:11:10.814804 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.814580 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:10.877817 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.877750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vrxxm" event={"ID":"8b242afd-b68e-4166-9b12-dcd12f17eca0","Type":"ContainerStarted","Data":"aa14648cc7bf928f9858bceed82e21a9514fc9ffb4a7e1ea1d74c2c4805f4052"}
Apr 20 20:11:10.883374 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.883333 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hs9bp" event={"ID":"a3e3e1bd-aa60-4683-a87b-474021dc8a77","Type":"ContainerStarted","Data":"5554212d42ee47a5fe05032995824a05be30c82ea01d8848e3db5d2d2b4a3dd4"}
Apr 20 20:11:10.885533 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.885492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerStarted","Data":"e7b753950a36b9c40545ac55aaec0fc5a19a4b5b17adcc7c4a1b07deffe21716"}
Apr 20 20:11:10.891151 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.891113 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"e6aa920f1affa878df264e510bfa842d5a7891326514ea8fdd6a9086a10cd222"}
Apr 20 20:11:10.896660 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.896624 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qgfkv" event={"ID":"53da44a9-d18c-47a4-b3e2-f5b196e47cbe","Type":"ContainerStarted","Data":"81efcee3ba5af74a704e92a7a8b3c3d5b1d77eeeaf1f7894291fd25f82df36c2"}
Apr 20 20:11:10.900512 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.900476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d96qr" event={"ID":"803590cd-665f-48a2-83de-4637da6a6b00","Type":"ContainerStarted","Data":"5304195ef56c44da1afcb454eeb1c942a724c2f6c3abbc6377d43e9ce27dc235"}
Apr 20 20:11:10.902680 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.902636 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" event={"ID":"815d5998-2dfe-4a37-8d71-80fd71ad5a3f","Type":"ContainerStarted","Data":"d5c01c3cda2a025e801fba98e6a3586d7f6816b679ce1a3bc8bff712aac14c29"}
Apr 20 20:11:10.905758 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.905687 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jfpm5" event={"ID":"ba687eca-e6d2-4355-91df-eb1ca17741fe","Type":"ContainerStarted","Data":"65aaeb432fe22b1b5a2a763ddcfe53a6f1919006f275715acf4475a294a84a9c"}
Apr 20 20:11:10.918251 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.918215 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" event={"ID":"ffa5dc35-4231-402c-85a6-9ae4ea55a914","Type":"ContainerStarted","Data":"983a91d81f38e82b1be309d8a0aec0ca0d5560747667634cc62fa815da8daed2"}
Apr 20 20:11:10.931043 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.930994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" event={"ID":"01a299fcf64ba455a08a0abdcaf7bcdd","Type":"ContainerStarted","Data":"934682632f0a51d311f3ba6be60973ec2b755c3e5269ed14d1a25766509780c5"}
Apr 20 20:11:10.932977 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.932926 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:10.944764 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.944726 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" event={"ID":"e1816a504f93ff56ac217cacae748395","Type":"ContainerStarted","Data":"e3ec82a25b2b8044dba5f68398605ce6e52c5c238194654ace51035fdc3ac1fb"}
Apr 20 20:11:10.991787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:10.991398 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:11.307222 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.307135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:11.307400 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.307373 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:11.307468 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.307439 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:13.307421609 +0000 UTC m=+5.062792120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:11.408044 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.407984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:11.408224 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.408156 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:11.408224 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.408178 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:11.408224 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.408191 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:11.408357 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.408251 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:13.408232368 +0000 UTC m=+5.163602882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:11.736936 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.736831 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:09 +0000 UTC" deadline="2027-11-30 00:52:26.674249287 +0000 UTC"
Apr 20 20:11:11.736936 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.736869 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14116h41m14.937384276s"
Apr 20 20:11:11.820524 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.820486 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:11.820720 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.820624 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:11.821111 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:11.821090 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:11.821219 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:11.821198 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:13.326129 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:13.326049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:13.326743 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.326217 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:13.326743 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.326279 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:17.326261111 +0000 UTC m=+9.081631634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:13.426468 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:13.426423 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:13.426704 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.426644 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:13.426704 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.426665 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:13.426704 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.426680 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:13.426876 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.426746 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:17.426719495 +0000 UTC m=+9.182090019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:13.820556 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:13.820463 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:13.820807 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.820592 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:13.820941 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:13.820926 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:13.821041 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:13.821010 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:15.820009 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:15.819975 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:15.820418 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:15.820103 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:15.820514 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:15.820484 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:15.820663 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:15.820593 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:17.360356 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:17.360316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:17.360924 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.360493 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:17.360924 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.360574 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:25.360553667 +0000 UTC m=+17.115924190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:17.461024 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:17.460975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:17.461189 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.461165 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:17.461189 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.461187 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:17.461280 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.461201 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:17.461280 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.461266 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:11:25.461245258 +0000 UTC m=+17.216615789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:17.820407 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:17.820370 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:17.820603 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.820501 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:17.821118 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:17.820972 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:17.821118 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:17.821078 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:19.820541 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:19.820492 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:19.820541 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:19.820532 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:19.821057 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:19.820628 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:19.821057 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:19.820737 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:21.820510 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:21.820466 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:21.820978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:21.820466 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:21.820978 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:21.820622 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:21.820978 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:21.820664 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:23.820456 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:23.820415 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:23.820456 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:23.820447 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:23.820931 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:23.820555 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:23.820931 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:23.820701 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:24.936339 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:24.936304 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d2467"] Apr 20 20:11:24.974297 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:24.974053 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:24.974297 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:24.974144 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed" Apr 20 20:11:25.016820 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.016774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-kubelet-config\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.017007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.016899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-dbus\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.017007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.016934 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.117568 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.117503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-kubelet-config\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.117721 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.117638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-dbus\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.117721 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.117664 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-kubelet-config\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.117721 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.117670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.117853 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.117743 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.117853 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.117804 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:25.617784916 +0000 UTC m=+17.373155425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.117853 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.117840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-dbus\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.420215 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.420183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:25.420399 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.420310 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:25.420399 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.420378 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.420359695 +0000 UTC m=+33.175730203 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:25.520785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.520744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:25.521022 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.520929 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:25.521022 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.520972 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:25.521022 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.520989 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.521165 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.521052 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:11:41.521031221 +0000 UTC m=+33.276401733 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.621311 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.621257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:25.621473 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.621406 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.621473 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.621468 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.621451917 +0000 UTC m=+18.376822424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.820738 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.820701 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:25.820928 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:25.820699 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:25.820928 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.820855 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:25.821078 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:25.820923 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:26.628363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:26.628330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:26.628770 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:26.628475 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:26.628770 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:26.628546 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:28.628527011 +0000 UTC m=+20.383897523 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:26.820037 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:26.819996 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:26.820218 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:26.820129 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed" Apr 20 20:11:27.820002 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:27.819961 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:27.820385 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:27.820017 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:27.820385 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:27.820096 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:27.820385 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:27.820154 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:28.641212 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:28.641184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:28.641359 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:28.641295 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:28.641359 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:28.641353 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:32.641339384 +0000 UTC m=+24.396709891 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:28.821302 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:28.821269 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:28.821615 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:28.821379 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:29.820282 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.819897 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:29.820403 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.819909 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:29.820403 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:29.820390 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:29.820513 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:29.820438 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:29.986595 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986564 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986850 2580 generic.go:358] "Generic (PLEG): container finished" podID="2bc1e339-b5d7-4ff2-81cc-110408fe4e5f" containerID="d39370777acb31453f901a97631aa5c2d5264878ff8d9f33c8973133787105b5" exitCode=1
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"521e100b6414b5994b9638464c83779e4adbaab1d8531f88c44616ea48bde847"}
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986969 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"e1bd71bdc079567e851050854393c04868f25d5eadef6454bbc780782cc3a8bb"}
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986981 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"f3c9d40ebc8cddf40aafb9978033cc7ee14987cdcce0131221c4585c633d3865"}
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.986990 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"b655fad5ac67118068b1a390487cae0fc4b3d52348478517532bac943f54aaf0"}
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.987002 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerDied","Data":"d39370777acb31453f901a97631aa5c2d5264878ff8d9f33c8973133787105b5"}
Apr 20 20:11:29.987237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.987016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"a444413ef59937ae394638aacbec6072d22aa5052bdb7762903136d2b7e29bb8"}
Apr 20 20:11:29.988078 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.988052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d96qr" event={"ID":"803590cd-665f-48a2-83de-4637da6a6b00","Type":"ContainerStarted","Data":"cb5e53a16bffa9dff9cbeeaa22121887278c227ec7fda9960be2c5bf5e896d5c"}
Apr 20 20:11:29.989249 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.989222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jfpm5" event={"ID":"ba687eca-e6d2-4355-91df-eb1ca17741fe","Type":"ContainerStarted","Data":"7e75d5fe512b271c972fdcce9a22e18e406619f16eefb54b2926548bbe952a51"}
Apr 20 20:11:29.990447 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.990426 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" event={"ID":"ffa5dc35-4231-402c-85a6-9ae4ea55a914","Type":"ContainerStarted","Data":"4b1087538f7257a3345d7b5eaa61fb4c7043f18007b12721cb2e2ad10bee3692"}
Apr 20 20:11:29.991654 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.991634 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" event={"ID":"01a299fcf64ba455a08a0abdcaf7bcdd","Type":"ContainerStarted","Data":"795b10a5717ed90b91e21cb5c272babd6d494008e9a6412ee00c9e0982299bd6"}
Apr 20 20:11:29.992857 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.992836 2580 generic.go:358] "Generic (PLEG): container finished" podID="e1816a504f93ff56ac217cacae748395" containerID="4a0eb8155dc2842b12966ab6593f80f96460acb925f595d65c14157bb1c60c77" exitCode=0
Apr 20 20:11:29.992919 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:29.992869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" event={"ID":"e1816a504f93ff56ac217cacae748395","Type":"ContainerDied","Data":"4a0eb8155dc2842b12966ab6593f80f96460acb925f595d65c14157bb1c60c77"}
Apr 20 20:11:30.001877 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.000304 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d96qr" podStartSLOduration=3.038390279 podStartE2EDuration="22.000289998s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.016020124 +0000 UTC m=+1.771390632" lastFinishedPulling="2026-04-20 20:11:28.97791984 +0000 UTC m=+20.733290351" observedRunningTime="2026-04-20 20:11:29.999845588 +0000 UTC m=+21.755216117" watchObservedRunningTime="2026-04-20 20:11:30.000289998 +0000 UTC m=+21.755660531"
Apr 20 20:11:30.023106 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.023071 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-183.ec2.internal" podStartSLOduration=22.023060006 podStartE2EDuration="22.023060006s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:30.02294145 +0000 UTC m=+21.778311977" watchObservedRunningTime="2026-04-20 20:11:30.023060006 +0000 UTC m=+21.778430535"
Apr 20 20:11:30.036155 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.036119 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qf9zb" podStartSLOduration=3.147784115 podStartE2EDuration="22.036105248s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.062991734 +0000 UTC m=+1.818362258" lastFinishedPulling="2026-04-20 20:11:28.951312882 +0000 UTC m=+20.706683391" observedRunningTime="2026-04-20 20:11:30.035761596 +0000 UTC m=+21.791132126" watchObservedRunningTime="2026-04-20 20:11:30.036105248 +0000 UTC m=+21.791475777"
Apr 20 20:11:30.050766 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.050729 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jfpm5" podStartSLOduration=3.191176004 podStartE2EDuration="22.050707113s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.115198141 +0000 UTC m=+1.870568655" lastFinishedPulling="2026-04-20 20:11:28.974729243 +0000 UTC m=+20.730099764" observedRunningTime="2026-04-20 20:11:30.050168779 +0000 UTC m=+21.805539310" watchObservedRunningTime="2026-04-20 20:11:30.050707113 +0000 UTC m=+21.806077643"
Apr 20 20:11:30.820248 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.820219 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:30.820414 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:30.820338 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:30.956932 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.956903 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:11:30.995491 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.995465 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vrxxm" event={"ID":"8b242afd-b68e-4166-9b12-dcd12f17eca0","Type":"ContainerStarted","Data":"6a0e0f0646fbe917cfc83aaee362a5606b0570831996e1cfb290fa35441f239e"}
Apr 20 20:11:30.996786 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.996765 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hs9bp" event={"ID":"a3e3e1bd-aa60-4683-a87b-474021dc8a77","Type":"ContainerStarted","Data":"8fecf56c9d2f2bda9f3cb16e3f6cb47271ee33188120dc765cb79b7edc37d3f6"}
Apr 20 20:11:30.998056 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.998036 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="5fda8f6fea9af19902c460190e3131ce9d4daab7e03796b6811f37c2ec6fb307" exitCode=0
Apr 20 20:11:30.998148 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.998099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"5fda8f6fea9af19902c460190e3131ce9d4daab7e03796b6811f37c2ec6fb307"}
Apr 20 20:11:30.999384 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:30.999359 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qgfkv" event={"ID":"53da44a9-d18c-47a4-b3e2-f5b196e47cbe","Type":"ContainerStarted","Data":"013b9a09ef214909b57ca4a77a993d5a14070cfdf76a88d981e68f581b86690d"}
Apr 20 20:11:31.001038 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.001020 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" event={"ID":"815d5998-2dfe-4a37-8d71-80fd71ad5a3f","Type":"ContainerStarted","Data":"a2c7e6c77bee2821d8e3ceced48deb14f48506c642dc2dc4cc17c41d3308c457"}
Apr 20 20:11:31.001117 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.001044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" event={"ID":"815d5998-2dfe-4a37-8d71-80fd71ad5a3f","Type":"ContainerStarted","Data":"b0a4179007d6b9d5409a561452868833d9045671bcce1fe93778158d2dcd0ed2"}
Apr 20 20:11:31.002507 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.002455 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" event={"ID":"e1816a504f93ff56ac217cacae748395","Type":"ContainerStarted","Data":"d190bd2d493c9b4b7040b5a59e45c46249822b8d9775b4695b06973ae52b40ba"}
Apr 20 20:11:31.018558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.018522 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vrxxm" podStartSLOduration=4.2621915040000005 podStartE2EDuration="23.018511251s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.194803503 +0000 UTC m=+1.950174011" lastFinishedPulling="2026-04-20 20:11:28.951123232 +0000 UTC m=+20.706493758" observedRunningTime="2026-04-20 20:11:31.007007734 +0000 UTC m=+22.762378263" watchObservedRunningTime="2026-04-20 20:11:31.018511251 +0000 UTC m=+22.773881836"
Apr 20 20:11:31.018686 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.018668 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qgfkv" podStartSLOduration=4.100071122 podStartE2EDuration="23.018664355s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.032444523 +0000 UTC m=+1.787815031" lastFinishedPulling="2026-04-20 20:11:28.951037756 +0000 UTC m=+20.706408264" observedRunningTime="2026-04-20 20:11:31.01821909 +0000 UTC m=+22.773589620" watchObservedRunningTime="2026-04-20 20:11:31.018664355 +0000 UTC m=+22.774034885"
Apr 20 20:11:31.063204 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.063172 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hs9bp" podStartSLOduration=4.283825311 podStartE2EDuration="23.063162482s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.194636079 +0000 UTC m=+1.950006598" lastFinishedPulling="2026-04-20 20:11:28.973973246 +0000 UTC m=+20.729343769" observedRunningTime="2026-04-20 20:11:31.06303075 +0000 UTC m=+22.818401280" watchObservedRunningTime="2026-04-20 20:11:31.063162482 +0000 UTC m=+22.818533011"
Apr 20 20:11:31.077816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.077764 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-183.ec2.internal" podStartSLOduration=23.077754977 podStartE2EDuration="23.077754977s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:31.077625469 +0000 UTC m=+22.832995999" watchObservedRunningTime="2026-04-20 20:11:31.077754977 +0000 UTC m=+22.833125506"
Apr 20 20:11:31.151788 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.151764 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:31.152275 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.152259 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:31.764625 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.764515 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:11:30.95691958Z","UUID":"9c1b4ba5-7603-4435-b45b-4d0ae3c3cdf5","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:11:31.767378 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.767352 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:11:31.767509 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.767387 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:11:31.820631 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.820606 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:31.820775 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:31.820608 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:31.820775 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:31.820722 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:31.820897 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:31.820811 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:32.009735 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.009552 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:11:32.010243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.010220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"ebb10bf86dddd7943048deeb497034ff66b36af7988e421584656080b85308c1"}
Apr 20 20:11:32.011179 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.010994 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:32.011702 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.011289 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vrxxm"
Apr 20 20:11:32.668841 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.668806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:32.669076 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:32.668969 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:32.669076 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:32.669041 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:40.669026771 +0000 UTC m=+32.424397278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:32.820271 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:32.820237 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:32.820438 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:32.820368 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:33.014182 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:33.014149 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" event={"ID":"815d5998-2dfe-4a37-8d71-80fd71ad5a3f","Type":"ContainerStarted","Data":"e35fbd9a63daa768aca88037d330d24616b2c52d63cc3b1ca66a38cc1b57db25"}
Apr 20 20:11:33.031724 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:33.031670 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qqsvt" podStartSLOduration=2.877112907 podStartE2EDuration="25.031654227s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.011024365 +0000 UTC m=+1.766394878" lastFinishedPulling="2026-04-20 20:11:32.165565682 +0000 UTC m=+23.920936198" observedRunningTime="2026-04-20 20:11:33.031577822 +0000 UTC m=+24.786948352" watchObservedRunningTime="2026-04-20 20:11:33.031654227 +0000 UTC m=+24.787024758"
Apr 20 20:11:33.820498 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:33.820048 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:33.820498 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:33.820078 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:33.820498 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:33.820167 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:33.820498 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:33.820318 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:34.820472 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:34.820437 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:34.820941 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:34.820569 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:35.820105 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:35.819901 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:35.820261 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:35.819913 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:35.820323 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:35.820174 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:35.820323 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:35.820295 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:36.019998 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.019970 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="c3241ff7929f47bf06982a9beb8903f25702217351e12402a5b1c936b1226072" exitCode=0
Apr 20 20:11:36.020520 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.020050 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"c3241ff7929f47bf06982a9beb8903f25702217351e12402a5b1c936b1226072"}
Apr 20 20:11:36.023195 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023154 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:11:36.023598 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"a11c47f502b82d272e528ead1db03d349d5e5db94769827930f26b4faf8ad196"}
Apr 20 20:11:36.023796 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023781 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:36.023859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023805 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:36.023859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023819 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:36.023979 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.023936 2580 scope.go:117] "RemoveContainer" containerID="d39370777acb31453f901a97631aa5c2d5264878ff8d9f33c8973133787105b5"
Apr 20 20:11:36.040414 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.040395 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:36.041632 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.041613 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml"
Apr 20 20:11:36.820682 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:36.820649 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:36.820903 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:36.820789 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:37.028493 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.028461 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:11:37.028816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.028792 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" event={"ID":"2bc1e339-b5d7-4ff2-81cc-110408fe4e5f","Type":"ContainerStarted","Data":"8fefb0810fd1e2cf157ded9edce5b8da2d85be3076c8c973d584c597cfbe3626"}
Apr 20 20:11:37.056000 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.055933 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" podStartSLOduration=9.834630281 podStartE2EDuration="29.055913808s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.053922318 +0000 UTC m=+1.809292825" lastFinishedPulling="2026-04-20 20:11:29.275205844 +0000 UTC m=+21.030576352" observedRunningTime="2026-04-20 20:11:37.054096761 +0000 UTC m=+28.809467290" watchObservedRunningTime="2026-04-20 20:11:37.055913808 +0000 UTC m=+28.811284434"
Apr 20 20:11:37.269973 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.269790 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hftxf"]
Apr 20 20:11:37.270134 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.270061 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:37.270205 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:37.270186 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:37.272740 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.272715 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d2467"]
Apr 20 20:11:37.272865 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.272818 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:37.272934 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:37.272913 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:37.273319 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.273302 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9tzr"]
Apr 20 20:11:37.273406 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:37.273390 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:37.273477 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:37.273461 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:38.031924 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:38.031894 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="8edc4491cae4ca20cbeb4f065ccb704ca2933d7259b263fe652890a59bdc3c83" exitCode=0
Apr 20 20:11:38.032318 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:38.031990 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"8edc4491cae4ca20cbeb4f065ccb704ca2933d7259b263fe652890a59bdc3c83"}
Apr 20 20:11:38.821253 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:38.821176 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:11:38.821400 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:38.821280 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9"
Apr 20 20:11:38.821400 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:38.821358 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf"
Apr 20 20:11:38.821507 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:38.821441 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435"
Apr 20 20:11:38.821507 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:38.821475 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:38.821597 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:38.821526 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed"
Apr 20 20:11:39.035483 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:39.035450 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="b4755270eb0f561cb1af1fa1349bd16f14bc7b8818c7bb7ffa229d8bb105d550" exitCode=0
Apr 20 20:11:39.035483 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:39.035480 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"b4755270eb0f561cb1af1fa1349bd16f14bc7b8818c7bb7ffa229d8bb105d550"}
Apr 20 20:11:40.732099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:40.732021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467"
Apr 20 20:11:40.732572 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:40.732166 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:40.732572 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:40.732247 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret podName:3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed nodeName:}" failed. No retries permitted until 2026-04-20 20:11:56.732225716 +0000 UTC m=+48.487596238 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret") pod "global-pull-secret-syncer-d2467" (UID: "3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:40.820370 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:40.820338 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:40.820544 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:40.820451 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hftxf" podUID="0e633d12-e3fe-490f-b2ea-097490061435" Apr 20 20:11:40.820544 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:40.820344 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:40.820780 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:40.820546 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d2467" podUID="3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed" Apr 20 20:11:40.820780 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:40.820338 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:40.820780 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:40.820637 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:11:41.436314 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:41.436279 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:41.436484 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.436448 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:41.436530 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.436522 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.436502289 +0000 UTC m=+65.191872811 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:41.537368 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:41.537336 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:41.537553 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.537534 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:41.537614 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.537559 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:41.537614 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.537569 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r5z56 for pod openshift-network-diagnostics/network-check-target-hftxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:41.537614 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:41.537613 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56 podName:0e633d12-e3fe-490f-b2ea-097490061435 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:12:13.537600183 +0000 UTC m=+65.292970696 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-r5z56" (UniqueName: "kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56") pod "network-check-target-hftxf" (UID: "0e633d12-e3fe-490f-b2ea-097490061435") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:42.059135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.059103 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-183.ec2.internal" event="NodeReady" Apr 20 20:11:42.059564 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.059260 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:11:42.100273 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.100234 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6b4d9"] Apr 20 20:11:42.107186 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.107156 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cxb87"] Apr 20 20:11:42.107386 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.107364 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.110010 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.109983 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.110131 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.110118 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:11:42.110682 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.110637 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\"" Apr 20 20:11:42.110807 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.110643 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:11:42.111966 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.111896 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6b4d9"] Apr 20 20:11:42.112711 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.112687 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:11:42.113096 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.113077 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:11:42.113863 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.113119 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:11:42.113863 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.113387 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\"" Apr 20 20:11:42.114741 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.114722 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxb87"] Apr 20 20:11:42.241265 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:11:42.241203 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-config-volume\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.241265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.241248 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.241265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.241267 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789hp\" (UniqueName: \"kubernetes.io/projected/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-kube-api-access-789hp\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.241585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.241361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.241585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.241416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-tmp-dir\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.241585 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.241452 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98r4p\" (UniqueName: \"kubernetes.io/projected/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-kube-api-access-98r4p\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.342530 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342494 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-tmp-dir\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.342800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98r4p\" (UniqueName: \"kubernetes.io/projected/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-kube-api-access-98r4p\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.342800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342620 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-config-volume\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.342800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342651 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 
20:11:42.342800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-789hp\" (UniqueName: \"kubernetes.io/projected/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-kube-api-access-789hp\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.342800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.343102 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.342827 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:42.343102 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.342827 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:42.343102 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.342898 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:42.842876242 +0000 UTC m=+34.598246765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:42.343102 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.342994 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:42.842972959 +0000 UTC m=+34.598343468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:42.343102 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.342998 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-tmp-dir\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.343987 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.343939 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-config-volume\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.355054 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.354840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-789hp\" (UniqueName: \"kubernetes.io/projected/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-kube-api-access-789hp\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " 
pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.355212 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.354891 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98r4p\" (UniqueName: \"kubernetes.io/projected/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-kube-api-access-98r4p\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.820987 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.820625 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:42.820987 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.820665 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:11:42.820987 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.820798 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:11:42.824427 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.824391 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:11:42.824427 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.824410 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:11:42.824623 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.824462 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\"" Apr 20 20:11:42.825742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.825720 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:11:42.826011 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.825997 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:11:42.826210 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.826195 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\"" Apr 20 20:11:42.846635 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.846600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:42.846814 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:42.846665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:42.846814 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.846786 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:42.846914 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.846876 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.846855655 +0000 UTC m=+35.602226163 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:42.846914 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.846786 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:42.847025 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:42.846966 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.846930933 +0000 UTC m=+35.602301446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:43.855393 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:43.855356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:43.855393 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:43.855407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:43.855835 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:43.855524 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:43.855835 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:43.855561 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:43.855835 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:43.855628 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:45.855590845 +0000 UTC m=+37.610961369 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:43.855835 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:43.855646 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:45.855639561 +0000 UTC m=+37.611010069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:45.054295 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:45.054206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerStarted","Data":"5f500c058af55a0342bd595761c3529371e6b46b251c422f428d2a7d57a29dc8"} Apr 20 20:11:45.871185 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:45.871137 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:45.871393 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:45.871198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" 
Apr 20 20:11:45.871393 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:45.871292 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:45.871393 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:45.871371 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:49.871352641 +0000 UTC m=+41.626723149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:45.871393 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:45.871385 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:45.871574 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:45.871450 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:49.871431899 +0000 UTC m=+41.626802408 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:46.058190 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:46.058156 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="5f500c058af55a0342bd595761c3529371e6b46b251c422f428d2a7d57a29dc8" exitCode=0 Apr 20 20:11:46.058622 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:46.058223 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"5f500c058af55a0342bd595761c3529371e6b46b251c422f428d2a7d57a29dc8"} Apr 20 20:11:47.062338 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:47.062303 2580 generic.go:358] "Generic (PLEG): container finished" podID="4682a90f-335e-4cef-bd3a-448c0f2a267f" containerID="c27caf38485a3ee2cd7b31fedda534b747e16e60a71b964fb2c8c26c6a6f2c33" exitCode=0 Apr 20 20:11:47.062789 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:47.062376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerDied","Data":"c27caf38485a3ee2cd7b31fedda534b747e16e60a71b964fb2c8c26c6a6f2c33"} Apr 20 20:11:48.067716 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:48.067682 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mql6h" event={"ID":"4682a90f-335e-4cef-bd3a-448c0f2a267f","Type":"ContainerStarted","Data":"8aac933eb9027ac2e924e5c5f623c61f5eeec11cb62afdc89b1c1c0e0dbbc065"} Apr 20 20:11:48.090470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:48.090420 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-mql6h" podStartSLOduration=5.390419104 podStartE2EDuration="40.090407127s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.09057912 +0000 UTC m=+1.845949628" lastFinishedPulling="2026-04-20 20:11:44.790567142 +0000 UTC m=+36.545937651" observedRunningTime="2026-04-20 20:11:48.088726718 +0000 UTC m=+39.844097239" watchObservedRunningTime="2026-04-20 20:11:48.090407127 +0000 UTC m=+39.845777656" Apr 20 20:11:49.903764 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:49.903724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:49.903764 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:49.903773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:49.904219 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:49.903864 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:49.904219 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:49.903873 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:49.904219 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:49.903913 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:11:57.903899296 +0000 UTC m=+49.659269804 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:49.904219 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:49.903927 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.90391962 +0000 UTC m=+49.659290129 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:56.755742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:56.755697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:56.759060 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:56.759025 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed-original-pull-secret\") pod \"global-pull-secret-syncer-d2467\" (UID: \"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed\") " pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:56.933576 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:56.933535 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d2467" Apr 20 20:11:57.115171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:57.115140 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d2467"] Apr 20 20:11:57.118811 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:11:57.118781 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8bb7c5_7cf9_4232_8a6e_83bfb1fc0bed.slice/crio-0f11a44ee16190fcdb59a152e338b6bc02ef847904775ae3c1ff4cfc4d8fee57 WatchSource:0}: Error finding container 0f11a44ee16190fcdb59a152e338b6bc02ef847904775ae3c1ff4cfc4d8fee57: Status 404 returned error can't find the container with id 0f11a44ee16190fcdb59a152e338b6bc02ef847904775ae3c1ff4cfc4d8fee57 Apr 20 20:11:57.964522 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:57.964484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:11:57.964995 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:57.964554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:11:57.964995 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:57.964700 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:57.964995 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:57.964740 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found 
Apr 20 20:11:57.964995 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:57.964792 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.9647722 +0000 UTC m=+65.720142709 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:11:57.964995 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:11:57.964812 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.964802017 +0000 UTC m=+65.720172525 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:11:58.087622 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:11:58.087581 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d2467" event={"ID":"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed","Type":"ContainerStarted","Data":"0f11a44ee16190fcdb59a152e338b6bc02ef847904775ae3c1ff4cfc4d8fee57"} Apr 20 20:12:02.097509 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:02.097461 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d2467" event={"ID":"3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed","Type":"ContainerStarted","Data":"04a24533c7cf7d08d93714e2ba2b7b58d933c41bd68a6e4125576c86b02f1b87"} Apr 20 20:12:02.114861 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:02.114778 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d2467" podStartSLOduration=34.209803065 podStartE2EDuration="38.114762139s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:57.120836965 +0000 UTC m=+48.876207474" lastFinishedPulling="2026-04-20 20:12:01.025796028 +0000 UTC m=+52.781166548" observedRunningTime="2026-04-20 20:12:02.114383186 +0000 UTC m=+53.869753715" watchObservedRunningTime="2026-04-20 20:12:02.114762139 +0000 UTC m=+53.870132706" Apr 20 20:12:08.043520 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:08.043480 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9hml" Apr 20 20:12:13.471408 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.471362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:12:13.474232 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.474211 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:12:13.482501 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.482472 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:12:13.482584 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.482554 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:17.482536003 +0000 UTC m=+129.237906511 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : secret "metrics-daemon-secret" not found Apr 20 20:12:13.571795 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.571755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:12:13.574723 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.574703 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:12:13.585611 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.585585 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:12:13.596413 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.596379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5z56\" (UniqueName: \"kubernetes.io/projected/0e633d12-e3fe-490f-b2ea-097490061435-kube-api-access-r5z56\") pod \"network-check-target-hftxf\" (UID: \"0e633d12-e3fe-490f-b2ea-097490061435\") " pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:12:13.750455 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.750422 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\"" Apr 20 20:12:13.758706 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.758673 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:12:13.879248 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.879216 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hftxf"] Apr 20 20:12:13.882910 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:12:13.882882 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e633d12_e3fe_490f_b2ea_097490061435.slice/crio-383a1372fdde6a23516cf7c752563b2a32977c20909e631a32938dd819a04e27 WatchSource:0}: Error finding container 383a1372fdde6a23516cf7c752563b2a32977c20909e631a32938dd819a04e27: Status 404 returned error can't find the container with id 383a1372fdde6a23516cf7c752563b2a32977c20909e631a32938dd819a04e27 Apr 20 20:12:13.975109 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.975074 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:12:13.975293 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:13.975131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:12:13.975293 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.975241 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:12:13.975293 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.975244 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 20 20:12:13.975415 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.975298 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:45.97528367 +0000 UTC m=+97.730654179 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:12:13.975415 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:13.975314 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:45.975306173 +0000 UTC m=+97.730676680 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:12:14.123111 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:14.122933 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hftxf" event={"ID":"0e633d12-e3fe-490f-b2ea-097490061435","Type":"ContainerStarted","Data":"383a1372fdde6a23516cf7c752563b2a32977c20909e631a32938dd819a04e27"} Apr 20 20:12:17.130668 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:17.130633 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hftxf" event={"ID":"0e633d12-e3fe-490f-b2ea-097490061435","Type":"ContainerStarted","Data":"36eef920668d9cfc520c686fa58616046d23e9f6b51a937c537c2cfdf5add7aa"} Apr 20 20:12:17.131104 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:17.130781 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:12:17.146596 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:17.146537 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hftxf" podStartSLOduration=66.374966231 podStartE2EDuration="1m9.146524409s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:12:13.885370413 +0000 UTC m=+65.640740923" lastFinishedPulling="2026-04-20 20:12:16.656928591 +0000 UTC m=+68.412299101" observedRunningTime="2026-04-20 20:12:17.146182167 +0000 UTC m=+68.901552697" watchObservedRunningTime="2026-04-20 20:12:17.146524409 +0000 UTC m=+68.901894939" Apr 20 20:12:28.746015 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.745979 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx"] Apr 20 20:12:28.749265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.749244 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.752010 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.751980 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 20:12:28.752167 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.752092 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 20:12:28.752233 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.752164 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 20:12:28.752288 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.752241 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 20:12:28.752475 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.752457 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 20:12:28.753374 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.753353 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 20:12:28.753493 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.753478 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 20:12:28.757973 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.757936 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx"] Apr 20 20:12:28.769896 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.769869 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.769896 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.769900 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.770106 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.769937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tv84\" (UniqueName: \"kubernetes.io/projected/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-kube-api-access-9tv84\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.770106 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.769983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.770106 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.770039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.770106 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.770055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870278 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870278 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870277 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: 
\"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870539 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tv84\" (UniqueName: \"kubernetes.io/projected/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-kube-api-access-9tv84\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870539 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870334 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870539 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870539 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.870414 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.870974 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:12:28.870937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.873850 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.873817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-ca\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.874006 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.873883 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.874092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.874070 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.874460 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.874441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:28.879687 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:28.879656 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tv84\" (UniqueName: \"kubernetes.io/projected/8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf-kube-api-access-9tv84\") pod \"cluster-proxy-proxy-agent-5bf9cc7877-b92jx\" (UID: \"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:29.073347 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:29.073248 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" Apr 20 20:12:29.194668 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:29.194632 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx"] Apr 20 20:12:29.197942 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:12:29.197905 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e0ee80e_4f8c_43ce_a9fd_dc8ca98ee0bf.slice/crio-73d5cd858d411ceccc7c25f02a5008cd4febe1d5ef7797e6b8d524dc48955a02 WatchSource:0}: Error finding container 73d5cd858d411ceccc7c25f02a5008cd4febe1d5ef7797e6b8d524dc48955a02: Status 404 returned error can't find the container with id 73d5cd858d411ceccc7c25f02a5008cd4febe1d5ef7797e6b8d524dc48955a02 Apr 20 20:12:30.158201 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:30.158153 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" 
event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerStarted","Data":"73d5cd858d411ceccc7c25f02a5008cd4febe1d5ef7797e6b8d524dc48955a02"} Apr 20 20:12:32.163234 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:32.163153 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerStarted","Data":"5c9f29a0b01ecc2c67d49dbde513629eb7a8106ca0287fb6f3880cd573270568"} Apr 20 20:12:34.169047 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:34.168973 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerStarted","Data":"7930f35a958027da3b9c72cf144e170121a46a7558e640a7dfc143549cdf83c0"} Apr 20 20:12:34.169047 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:34.169014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerStarted","Data":"811f0115323336ab45cd72d9ff99c7c0826639e839fb48ee66e5f69a127e463c"} Apr 20 20:12:34.189808 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:34.189767 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" podStartSLOduration=1.499203213 podStartE2EDuration="6.189755404s" podCreationTimestamp="2026-04-20 20:12:28 +0000 UTC" firstStartedPulling="2026-04-20 20:12:29.200277315 +0000 UTC m=+80.955647824" lastFinishedPulling="2026-04-20 20:12:33.890829494 +0000 UTC m=+85.646200015" observedRunningTime="2026-04-20 20:12:34.188627241 +0000 UTC m=+85.943997770" watchObservedRunningTime="2026-04-20 20:12:34.189755404 +0000 UTC m=+85.945125934" Apr 20 20:12:45.989794 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:12:45.989748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:12:45.990254 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:45.989818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:12:45.990254 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:45.989912 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:12:45.990254 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:45.989930 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:12:45.990254 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:45.990002 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls podName:7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:49.989985689 +0000 UTC m=+161.745356197 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls") pod "dns-default-6b4d9" (UID: "7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47") : secret "dns-default-metrics-tls" not found Apr 20 20:12:45.990254 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:12:45.990017 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert podName:ba7ed180-9b68-40c1-9f30-e9a6a5c96af3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:49.990010904 +0000 UTC m=+161.745381411 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert") pod "ingress-canary-cxb87" (UID: "ba7ed180-9b68-40c1-9f30-e9a6a5c96af3") : secret "canary-serving-cert" not found Apr 20 20:12:48.134802 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:12:48.134770 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hftxf" Apr 20 20:13:03.028061 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:03.028031 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d96qr_803590cd-665f-48a2-83de-4637da6a6b00/dns-node-resolver/0.log" Apr 20 20:13:04.028058 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:04.028025 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qgfkv_53da44a9-d18c-47a4-b3e2-f5b196e47cbe/node-ca/0.log" Apr 20 20:13:14.654529 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.654498 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wv5kz"] Apr 20 20:13:14.656416 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.656398 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.659112 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.659084 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:13:14.659232 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.659213 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:13:14.659306 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.659218 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stvsn\"" Apr 20 20:13:14.660482 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.660464 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:13:14.660570 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.660464 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:13:14.666508 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.666489 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wv5kz"] Apr 20 20:13:14.782767 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.782732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.782767 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.782766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-cwrvv\" (UniqueName: \"kubernetes.io/projected/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-api-access-cwrvv\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.782983 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.782805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ae0abd1a-d13c-4cc9-b401-6022ce717e69-crio-socket\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.782983 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.782851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0abd1a-d13c-4cc9-b401-6022ce717e69-data-volume\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.782983 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.782877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883767 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ae0abd1a-d13c-4cc9-b401-6022ce717e69-crio-socket\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " 
pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0abd1a-d13c-4cc9-b401-6022ce717e69-data-volume\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883842 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883859 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwrvv\" (UniqueName: \"kubernetes.io/projected/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-api-access-cwrvv\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.883857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/ae0abd1a-d13c-4cc9-b401-6022ce717e69-crio-socket\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.883982 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:14.883975 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:14.884277 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:14.884031 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls podName:ae0abd1a-d13c-4cc9-b401-6022ce717e69 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:15.384017536 +0000 UTC m=+127.139388045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wv5kz" (UID: "ae0abd1a-d13c-4cc9-b401-6022ce717e69") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:14.884277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.884197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0abd1a-d13c-4cc9-b401-6022ce717e69-data-volume\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.884450 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.884432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " 
pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:14.894050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:14.894027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwrvv\" (UniqueName: \"kubernetes.io/projected/ae0abd1a-d13c-4cc9-b401-6022ce717e69-kube-api-access-cwrvv\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:15.386944 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:15.386906 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:15.387131 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:15.387049 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:15.387131 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:15.387130 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls podName:ae0abd1a-d13c-4cc9-b401-6022ce717e69 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:16.387112613 +0000 UTC m=+128.142483121 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wv5kz" (UID: "ae0abd1a-d13c-4cc9-b401-6022ce717e69") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:16.394668 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:16.394619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:16.395092 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:16.394766 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:16.395092 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:16.394844 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls podName:ae0abd1a-d13c-4cc9-b401-6022ce717e69 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:18.3948267 +0000 UTC m=+130.150197208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wv5kz" (UID: "ae0abd1a-d13c-4cc9-b401-6022ce717e69") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:17.501591 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:17.501555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr" Apr 20 20:13:17.501987 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:17.501661 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:13:17.501987 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:17.501732 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs podName:ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9 nodeName:}" failed. No retries permitted until 2026-04-20 20:15:19.501718254 +0000 UTC m=+251.257088763 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs") pod "network-metrics-daemon-z9tzr" (UID: "ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9") : secret "metrics-daemon-secret" not found Apr 20 20:13:18.408532 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:18.408479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:18.408714 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:18.408635 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:18.408714 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:18.408700 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls podName:ae0abd1a-d13c-4cc9-b401-6022ce717e69 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:22.408685081 +0000 UTC m=+134.164055589 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wv5kz" (UID: "ae0abd1a-d13c-4cc9-b401-6022ce717e69") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.434368 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:22.434323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:22.434973 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:22.434502 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.434973 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:22.434597 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls podName:ae0abd1a-d13c-4cc9-b401-6022ce717e69 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:30.434576052 +0000 UTC m=+142.189946562 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wv5kz" (UID: "ae0abd1a-d13c-4cc9-b401-6022ce717e69") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:30.488725 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:30.488683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:30.491084 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:30.491063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae0abd1a-d13c-4cc9-b401-6022ce717e69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wv5kz\" (UID: \"ae0abd1a-d13c-4cc9-b401-6022ce717e69\") " pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:30.565326 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:30.565301 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wv5kz" Apr 20 20:13:30.682073 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:30.681972 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wv5kz"] Apr 20 20:13:30.684964 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:30.684920 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae0abd1a_d13c_4cc9_b401_6022ce717e69.slice/crio-294664315e2d1e6e4c827ccda4d67ff05698ab44fda2e25ae6ffb45115c17d6d WatchSource:0}: Error finding container 294664315e2d1e6e4c827ccda4d67ff05698ab44fda2e25ae6ffb45115c17d6d: Status 404 returned error can't find the container with id 294664315e2d1e6e4c827ccda4d67ff05698ab44fda2e25ae6ffb45115c17d6d Apr 20 20:13:31.282831 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:31.282801 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wv5kz" event={"ID":"ae0abd1a-d13c-4cc9-b401-6022ce717e69","Type":"ContainerStarted","Data":"964bae0116693b24c6c49da44f1948790ae2701cedcfc681bb82250f7d300bf3"} Apr 20 20:13:31.282831 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:31.282835 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wv5kz" event={"ID":"ae0abd1a-d13c-4cc9-b401-6022ce717e69","Type":"ContainerStarted","Data":"294664315e2d1e6e4c827ccda4d67ff05698ab44fda2e25ae6ffb45115c17d6d"} Apr 20 20:13:32.287351 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:32.287314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wv5kz" event={"ID":"ae0abd1a-d13c-4cc9-b401-6022ce717e69","Type":"ContainerStarted","Data":"5a6f213d652501a4004d29f05622d054060f64efe40cc8573abc2d7eed98e3f0"} Apr 20 20:13:33.292040 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:33.292001 2580 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-wv5kz" event={"ID":"ae0abd1a-d13c-4cc9-b401-6022ce717e69","Type":"ContainerStarted","Data":"d2a74db8e751337111c32c4ee0f57a9a4cbc446e8321a43ad6c6187bc9e5f6d7"} Apr 20 20:13:33.311707 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:33.311656 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wv5kz" podStartSLOduration=16.911216792 podStartE2EDuration="19.311643937s" podCreationTimestamp="2026-04-20 20:13:14 +0000 UTC" firstStartedPulling="2026-04-20 20:13:30.740745196 +0000 UTC m=+142.496115704" lastFinishedPulling="2026-04-20 20:13:33.141172326 +0000 UTC m=+144.896542849" observedRunningTime="2026-04-20 20:13:33.310179349 +0000 UTC m=+145.065549880" watchObservedRunningTime="2026-04-20 20:13:33.311643937 +0000 UTC m=+145.067014468" Apr 20 20:13:37.917863 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.917821 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56464c876-d69x8"] Apr 20 20:13:37.919893 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.919873 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:37.922636 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.922611 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:13:37.922757 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.922673 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:13:37.922757 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.922729 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rb8vs\"" Apr 20 20:13:37.924076 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.924056 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:13:37.929004 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.928986 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:13:37.932174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.932155 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56464c876-d69x8"] Apr 20 20:13:37.987109 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:37.987084 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56464c876-d69x8"] Apr 20 20:13:37.987241 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:37.987221 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-695fq registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[bound-sa-token ca-trust-extracted 
image-registry-private-configuration installation-pull-secrets kube-api-access-695fq registry-certificates registry-tls trusted-ca]: context canceled" pod="openshift-image-registry/image-registry-56464c876-d69x8" podUID="023bef5d-5f7e-40d9-9acd-6bfddd6990b6" Apr 20 20:13:38.036463 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.036437 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-cd4b6f4f4-kcq97"] Apr 20 20:13:38.038331 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.038312 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.048116 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048198 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048124 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048198 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048155 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates\") pod \"image-registry-56464c876-d69x8\" (UID: 
\"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048273 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048227 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048273 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048263 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048336 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048336 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.048395 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.048348 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695fq\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.058335 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.058314 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cd4b6f4f4-kcq97"] Apr 20 20:13:38.148915 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.148884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.149092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.148924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-tls\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.149092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.148990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.149092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149023 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-image-registry-private-configuration\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.149092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149069 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghrb\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-kube-api-access-7ghrb\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149177 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149218 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149236 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-certificates\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149259 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-bound-sa-token\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.149304 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149293 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.150012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149338 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-trusted-ca\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 
20 20:13:38.150012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-695fq\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.150012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-installation-pull-secrets\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.150012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3d12ef-f755-4800-bd09-5d0005e38f41-ca-trust-extracted\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.150012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.149482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.150264 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.150096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.150524 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.150499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.150884 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.150864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.151704 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.151668 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.151807 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.151775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.151807 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:13:38.151784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.164524 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.164496 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-695fq\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.164592 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.164569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token\") pod \"image-registry-56464c876-d69x8\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.250518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-trusted-ca\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.250627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-installation-pull-secrets\") pod 
\"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.250627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3d12ef-f755-4800-bd09-5d0005e38f41-ca-trust-extracted\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.250725 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-tls\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.250932 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-image-registry-private-configuration\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251027 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.250970 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghrb\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-kube-api-access-7ghrb\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251027 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.251013 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-certificates\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251126 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.251026 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3d12ef-f755-4800-bd09-5d0005e38f41-ca-trust-extracted\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251126 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.251050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-bound-sa-token\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251720 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.251692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-trusted-ca\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.251809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.251696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-certificates\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " 
pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.253092 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.253067 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-installation-pull-secrets\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.253178 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.253144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-registry-tls\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.253243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.253225 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cd3d12ef-f755-4800-bd09-5d0005e38f41-image-registry-private-configuration\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.261675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.261651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-bound-sa-token\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.262126 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.262108 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghrb\" (UniqueName: 
\"kubernetes.io/projected/cd3d12ef-f755-4800-bd09-5d0005e38f41-kube-api-access-7ghrb\") pod \"image-registry-cd4b6f4f4-kcq97\" (UID: \"cd3d12ef-f755-4800-bd09-5d0005e38f41\") " pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.304978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.304942 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.308998 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.308983 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:38.347000 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.346976 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:38.452745 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452719 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.452905 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452754 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.452905 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452804 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695fq\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: 
\"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.452905 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452834 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.452905 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452879 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.452905 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452904 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.453200 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452933 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 20:13:38.453200 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.452980 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets\") pod \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\" (UID: \"023bef5d-5f7e-40d9-9acd-6bfddd6990b6\") " Apr 20 
20:13:38.453294 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.453265 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:13:38.453687 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.453539 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:38.453945 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.453924 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:38.455137 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.455105 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:38.455137 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.455116 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:38.455263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.455131 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq" (OuterVolumeSpecName: "kube-api-access-695fq") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "kube-api-access-695fq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:38.455556 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.455538 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:38.455608 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.455595 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "023bef5d-5f7e-40d9-9acd-6bfddd6990b6" (UID: "023bef5d-5f7e-40d9-9acd-6bfddd6990b6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:38.460527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.460509 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cd4b6f4f4-kcq97"] Apr 20 20:13:38.462890 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:38.462867 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3d12ef_f755_4800_bd09_5d0005e38f41.slice/crio-186fca1bd9dc285fab02c1fbfc05f0d9d375e919f6888e769d8d604c50adb521 WatchSource:0}: Error finding container 186fca1bd9dc285fab02c1fbfc05f0d9d375e919f6888e769d8d604c50adb521: Status 404 returned error can't find the container with id 186fca1bd9dc285fab02c1fbfc05f0d9d375e919f6888e769d8d604c50adb521 Apr 20 20:13:38.554374 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554341 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-image-registry-private-configuration\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554374 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554376 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-certificates\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554392 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-trusted-ca\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554407 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-ca-trust-extracted\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554424 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-installation-pull-secrets\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554438 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-bound-sa-token\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554452 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-registry-tls\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:38.554527 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:38.554468 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-695fq\" (UniqueName: \"kubernetes.io/projected/023bef5d-5f7e-40d9-9acd-6bfddd6990b6-kube-api-access-695fq\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:13:39.074687 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.074634 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" podUID="8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:13:39.308199 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.308164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" 
event={"ID":"cd3d12ef-f755-4800-bd09-5d0005e38f41","Type":"ContainerStarted","Data":"d95062cb948bd937aad293cc2e9f2f023bff9f62a76d6fe40a19ce5828e03571"} Apr 20 20:13:39.308199 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.308200 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" event={"ID":"cd3d12ef-f755-4800-bd09-5d0005e38f41","Type":"ContainerStarted","Data":"186fca1bd9dc285fab02c1fbfc05f0d9d375e919f6888e769d8d604c50adb521"} Apr 20 20:13:39.308199 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.308171 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56464c876-d69x8" Apr 20 20:13:39.308529 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.308499 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" Apr 20 20:13:39.331007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.330943 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" podStartSLOduration=1.330931638 podStartE2EDuration="1.330931638s" podCreationTimestamp="2026-04-20 20:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:13:39.329928996 +0000 UTC m=+151.085299522" watchObservedRunningTime="2026-04-20 20:13:39.330931638 +0000 UTC m=+151.086302167" Apr 20 20:13:39.360080 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.360056 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56464c876-d69x8"] Apr 20 20:13:39.364277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:39.364256 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56464c876-d69x8"] Apr 20 20:13:40.823580 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:13:40.823548 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023bef5d-5f7e-40d9-9acd-6bfddd6990b6" path="/var/lib/kubelet/pods/023bef5d-5f7e-40d9-9acd-6bfddd6990b6/volumes" Apr 20 20:13:45.121133 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:45.121090 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6b4d9" podUID="7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47" Apr 20 20:13:45.127587 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:45.127560 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cxb87" podUID="ba7ed180-9b68-40c1-9f30-e9a6a5c96af3" Apr 20 20:13:45.321341 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:45.321311 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:13:45.321496 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:45.321317 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6b4d9" Apr 20 20:13:45.840133 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:45.840052 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z9tzr" podUID="ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9" Apr 20 20:13:46.837713 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.837648 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"] Apr 20 20:13:46.843121 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.843103 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" Apr 20 20:13:46.847309 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847285 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 20:13:46.847520 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847296 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:13:46.847520 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847298 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 20:13:46.847649 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847296 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:13:46.847649 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847298 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:13:46.847726 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:13:46.847310 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 20:13:46.847726 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.847323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-dhzsm\"" Apr 20 20:13:46.853420 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.853401 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"] Apr 20 20:13:46.854116 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.854097 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f7x6t"] Apr 20 20:13:46.858086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.858068 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f7x6t" Apr 20 20:13:46.860469 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.860448 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:13:46.860712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.860692 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:13:46.860803 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.860780 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c8f4q\"" Apr 20 20:13:46.860859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.860841 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:13:46.916652 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916627 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:46.916760 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22c8\" (UniqueName: \"kubernetes.io/projected/ec3754cb-191c-482e-bbf0-218c49b41734-kube-api-access-q22c8\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:46.916760 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916689 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:46.916830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec3754cb-191c-482e-bbf0-218c49b41734-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:46.916830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:46.916830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:46.916824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017485 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017459 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-root\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017489 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-sys\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017508 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017537 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-textfile\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-tls\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017586 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-accelerators-collector-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-wtmp\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-metrics-client-ca\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017741 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q22c8\" (UniqueName: \"kubernetes.io/projected/ec3754cb-191c-482e-bbf0-218c49b41734-kube-api-access-q22c8\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017870 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017870 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvj7w\" (UniqueName: \"kubernetes.io/projected/f7852481-fc7c-425e-ab75-dc92a22dd20c-kube-api-access-fvj7w\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.017978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec3754cb-191c-482e-bbf0-218c49b41734-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017892 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.017978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.017921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.018361 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.018336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec3754cb-191c-482e-bbf0-218c49b41734-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.018634 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.018591 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.018634 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.018604 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.020261 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.020236 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.020357 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.020344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec3754cb-191c-482e-bbf0-218c49b41734-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.026755 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.026733 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22c8\" (UniqueName: \"kubernetes.io/projected/ec3754cb-191c-482e-bbf0-218c49b41734-kube-api-access-q22c8\") pod \"kube-state-metrics-69db897b98-gcgh7\" (UID: \"ec3754cb-191c-482e-bbf0-218c49b41734\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.118504 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvj7w\" (UniqueName: \"kubernetes.io/projected/f7852481-fc7c-425e-ab75-dc92a22dd20c-kube-api-access-fvj7w\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118504 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-root\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118504 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-sys\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118541 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-root\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-textfile\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-sys\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-tls\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.118743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-accelerators-collector-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118811 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-wtmp\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-metrics-client-ca\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.118910 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-textfile\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.119003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-wtmp\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119249 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.119236 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-accelerators-collector-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.119475 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.119324 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7852481-fc7c-425e-ab75-dc92a22dd20c-metrics-client-ca\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.121050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.121022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-tls\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.121165 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.121051 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7852481-fc7c-425e-ab75-dc92a22dd20c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.126476 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.126445 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvj7w\" (UniqueName: \"kubernetes.io/projected/f7852481-fc7c-425e-ab75-dc92a22dd20c-kube-api-access-fvj7w\") pod \"node-exporter-f7x6t\" (UID: \"f7852481-fc7c-425e-ab75-dc92a22dd20c\") " pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.152369 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.152351 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"
Apr 20 20:13:47.166193 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.166175 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f7x6t"
Apr 20 20:13:47.175597 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:47.175574 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7852481_fc7c_425e_ab75_dc92a22dd20c.slice/crio-0e4ff0f267636008a3d0b433c9a9c279d0c2c2ac3ff557431919c1553efbc92d WatchSource:0}: Error finding container 0e4ff0f267636008a3d0b433c9a9c279d0c2c2ac3ff557431919c1553efbc92d: Status 404 returned error can't find the container with id 0e4ff0f267636008a3d0b433c9a9c279d0c2c2ac3ff557431919c1553efbc92d
Apr 20 20:13:47.270156 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.270129 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcgh7"]
Apr 20 20:13:47.273411 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:47.273373 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3754cb_191c_482e_bbf0_218c49b41734.slice/crio-3745a2c1b0a783bccbdabbc4277ead48a8c6c60c0f43b7a0ce48bfa813b68b18 WatchSource:0}: Error finding container 3745a2c1b0a783bccbdabbc4277ead48a8c6c60c0f43b7a0ce48bfa813b68b18: Status 404 returned error can't find the container with id 3745a2c1b0a783bccbdabbc4277ead48a8c6c60c0f43b7a0ce48bfa813b68b18
Apr 20 20:13:47.327438 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.327403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f7x6t" event={"ID":"f7852481-fc7c-425e-ab75-dc92a22dd20c","Type":"ContainerStarted","Data":"0e4ff0f267636008a3d0b433c9a9c279d0c2c2ac3ff557431919c1553efbc92d"}
Apr 20 20:13:47.328402 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.328373 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" event={"ID":"ec3754cb-191c-482e-bbf0-218c49b41734","Type":"ContainerStarted","Data":"3745a2c1b0a783bccbdabbc4277ead48a8c6c60c0f43b7a0ce48bfa813b68b18"}
Apr 20 20:13:47.925556 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.925530 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:13:47.929935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.929913 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:47.932716 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932693 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 20:13:47.932716 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932706 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 20:13:47.932886 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932737 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 20:13:47.932886 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932701 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 20:13:47.932886 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932696 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 20:13:47.932886 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.932871 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 20:13:47.933150 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.933126 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 20:13:47.933215 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.933160 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 20:13:47.933273 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.933229 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 20:13:47.933273 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.933246 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nwd95\""
Apr 20 20:13:47.941518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:47.941500 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:13:48.025585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.025744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025603 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vpj\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.025744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025642 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.025744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025667 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.025744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025698 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025775 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.025942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.026007 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026480 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.026038 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.026480 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.026055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127252 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127423 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9vpj\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127423 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127334 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127423 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127423 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127381 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127606 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127426 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127606 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127606 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127606 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127790 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127649 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.127790 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.127683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.128470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.128066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.128470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.128129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.128470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.128163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.128470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.128242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.128743 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.128711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.129313 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:48.128809 2580 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 20 20:13:48.129313 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:13:48.128879 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls podName:c156e30d-3f28-45a6-b7eb-2e01a40bda41 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:48.628862241 +0000 UTC m=+160.384232750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41") : secret "alertmanager-main-tls" not found
Apr 20 20:13:48.130662 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.130594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.130755 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.130697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.130850 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.130831 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.131587 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.131564 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.131682 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.131585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.132158 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.132140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.132629 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.132606 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.132741 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.132728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:13:48.138134 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.138086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vpj\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20
20:13:48.334859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.334815 2580 generic.go:358] "Generic (PLEG): container finished" podID="f7852481-fc7c-425e-ab75-dc92a22dd20c" containerID="086f686ea687bc9047bfca7b93c8008c7fac2e0710ff6e2e95419ba0e6dc3de3" exitCode=0 Apr 20 20:13:48.335149 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.334887 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f7x6t" event={"ID":"f7852481-fc7c-425e-ab75-dc92a22dd20c","Type":"ContainerDied","Data":"086f686ea687bc9047bfca7b93c8008c7fac2e0710ff6e2e95419ba0e6dc3de3"} Apr 20 20:13:48.632462 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.632431 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:13:48.635311 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.635285 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:13:48.839104 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:48.839069 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:13:49.004970 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.004922 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:13:49.008063 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:49.008037 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc156e30d_3f28_45a6_b7eb_2e01a40bda41.slice/crio-9c359448af4dfef69a1df89aec19b9f9c3ade3db02c4ad4afd5bbd0b41eed6eb WatchSource:0}: Error finding container 9c359448af4dfef69a1df89aec19b9f9c3ade3db02c4ad4afd5bbd0b41eed6eb: Status 404 returned error can't find the container with id 9c359448af4dfef69a1df89aec19b9f9c3ade3db02c4ad4afd5bbd0b41eed6eb Apr 20 20:13:49.074364 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.074330 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" podUID="8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:13:49.339671 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.339640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f7x6t" event={"ID":"f7852481-fc7c-425e-ab75-dc92a22dd20c","Type":"ContainerStarted","Data":"9d7072beb286b5f8487a7015b03befa6f144b61514a1f6983df6204ee2366024"} Apr 20 20:13:49.339671 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.339673 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f7x6t" event={"ID":"f7852481-fc7c-425e-ab75-dc92a22dd20c","Type":"ContainerStarted","Data":"5c214a74b2479c0cdb750713a9d45c1aca364932b9448443bd727cbb8a964fa4"} Apr 20 20:13:49.341480 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.341455 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" event={"ID":"ec3754cb-191c-482e-bbf0-218c49b41734","Type":"ContainerStarted","Data":"82eca3bb901144a413dd20243454b526d3d9694ebcfed10e71615561a9cc5415"} Apr 20 20:13:49.341611 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.341487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" event={"ID":"ec3754cb-191c-482e-bbf0-218c49b41734","Type":"ContainerStarted","Data":"84e6fe0e73eed67beeb975c26d30fcac284774f7332b555ae8556c63c5524187"} Apr 20 20:13:49.341611 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.341501 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" event={"ID":"ec3754cb-191c-482e-bbf0-218c49b41734","Type":"ContainerStarted","Data":"ba59173c456785127e19222c1eb261c4770f7f3bb87ced92360970518aa48df2"} Apr 20 20:13:49.342440 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.342419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"9c359448af4dfef69a1df89aec19b9f9c3ade3db02c4ad4afd5bbd0b41eed6eb"} Apr 20 20:13:49.356847 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.356808 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f7x6t" podStartSLOduration=2.645841172 podStartE2EDuration="3.356797006s" podCreationTimestamp="2026-04-20 20:13:46 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.177251748 +0000 UTC m=+158.932622257" lastFinishedPulling="2026-04-20 20:13:47.888207581 +0000 UTC m=+159.643578091" observedRunningTime="2026-04-20 20:13:49.356183965 +0000 UTC m=+161.111554519" watchObservedRunningTime="2026-04-20 20:13:49.356797006 +0000 UTC m=+161.112167570" Apr 20 20:13:49.375542 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:49.375507 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcgh7" podStartSLOduration=1.933025542 podStartE2EDuration="3.375495176s" podCreationTimestamp="2026-04-20 20:13:46 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.275220554 +0000 UTC m=+159.030591063" lastFinishedPulling="2026-04-20 20:13:48.717690188 +0000 UTC m=+160.473060697" observedRunningTime="2026-04-20 20:13:49.374813503 +0000 UTC m=+161.130184033" watchObservedRunningTime="2026-04-20 20:13:49.375495176 +0000 UTC m=+161.130865706" Apr 20 20:13:50.047164 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.047118 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: \"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:13:50.047579 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.047193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:13:50.049625 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.049597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47-metrics-tls\") pod \"dns-default-6b4d9\" (UID: \"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47\") " pod="openshift-dns/dns-default-6b4d9" Apr 20 20:13:50.049728 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.049644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7ed180-9b68-40c1-9f30-e9a6a5c96af3-cert\") pod \"ingress-canary-cxb87\" (UID: 
\"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3\") " pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:13:50.124756 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.124723 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\"" Apr 20 20:13:50.126082 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.126059 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\"" Apr 20 20:13:50.132318 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.132295 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6b4d9" Apr 20 20:13:50.132318 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.132310 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxb87" Apr 20 20:13:50.425020 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.424998 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6b4d9"] Apr 20 20:13:50.427135 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:50.427090 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6ab6ed_0ff9_43e5_a89d_3b8dc9b75f47.slice/crio-d3eb7d462d7e7ae1e5eaaf354ffc9e7ed4ab76caf3a49e5bd5d7e7044d20e3d0 WatchSource:0}: Error finding container d3eb7d462d7e7ae1e5eaaf354ffc9e7ed4ab76caf3a49e5bd5d7e7044d20e3d0: Status 404 returned error can't find the container with id d3eb7d462d7e7ae1e5eaaf354ffc9e7ed4ab76caf3a49e5bd5d7e7044d20e3d0 Apr 20 20:13:50.441225 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:50.441204 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxb87"] Apr 20 20:13:50.443692 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:50.443670 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7ed180_9b68_40c1_9f30_e9a6a5c96af3.slice/crio-48c593fcf1e85a60ef30b2c12e3f30e8fa4af6913f12678d46d2e402d244e81c WatchSource:0}: Error finding container 48c593fcf1e85a60ef30b2c12e3f30e8fa4af6913f12678d46d2e402d244e81c: Status 404 returned error can't find the container with id 48c593fcf1e85a60ef30b2c12e3f30e8fa4af6913f12678d46d2e402d244e81c Apr 20 20:13:51.123817 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.123783 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"] Apr 20 20:13:51.127282 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.127255 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.130610 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.130168 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 20:13:51.130610 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.130425 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 20:13:51.130610 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.130564 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 20:13:51.131573 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.131482 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 20:13:51.131683 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.131630 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qdzcq\"" Apr 20 20:13:51.131683 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.131643 2580 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-b89u7mv85upsa\"" Apr 20 20:13:51.134003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.133984 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"] Apr 20 20:13:51.260714 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260678 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.260931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260727 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-audit-log\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.260931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260757 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-metrics-server-audit-profiles\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.260931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-client-certs\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.260931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260889 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-client-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.260931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.260919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr7sm\" (UniqueName: \"kubernetes.io/projected/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-kube-api-access-xr7sm\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.261265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.261013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-tls\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.348786 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.348755 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27" exitCode=0 Apr 20 20:13:51.348943 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.348788 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27"} Apr 20 20:13:51.351021 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.350994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b4d9" event={"ID":"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47","Type":"ContainerStarted","Data":"d3eb7d462d7e7ae1e5eaaf354ffc9e7ed4ab76caf3a49e5bd5d7e7044d20e3d0"} Apr 20 20:13:51.352278 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.352254 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxb87" event={"ID":"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3","Type":"ContainerStarted","Data":"48c593fcf1e85a60ef30b2c12e3f30e8fa4af6913f12678d46d2e402d244e81c"} Apr 20 20:13:51.362093 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-audit-log\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362154 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-metrics-server-audit-profiles\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-client-certs\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-client-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362274 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7sm\" (UniqueName: \"kubernetes.io/projected/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-kube-api-access-xr7sm\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362440 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-tls\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " 
pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.362832 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.362810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.363264 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.363223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-audit-log\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.363817 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.363793 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-metrics-server-audit-profiles\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.366021 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.365978 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-client-ca-bundle\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.366021 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.365995 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-tls\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.366181 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.366159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-secret-metrics-server-client-certs\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.371685 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.371664 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7sm\" (UniqueName: \"kubernetes.io/projected/8f710e3e-d3cc-474f-ba2d-1bebaec01dcf-kube-api-access-xr7sm\") pod \"metrics-server-5fb54d5cbb-nm9lw\" (UID: \"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf\") " pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.395128 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.395058 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-866497797c-4vsdx"] Apr 20 20:13:51.399439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.399411 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.402394 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402251 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:13:51.402394 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402310 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m57vs\"" Apr 20 20:13:51.402394 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402347 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 20:13:51.402675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402459 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:13:51.402675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402485 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 20:13:51.402675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402486 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 20:13:51.402675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402578 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 20:13:51.402675 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.402622 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 20:13:51.408868 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.408328 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 20:13:51.409963 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:13:51.409907 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-866497797c-4vsdx"] Apr 20 20:13:51.439907 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.439883 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" Apr 20 20:13:51.565372 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565338 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565532 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565532 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565509 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca\") pod \"console-866497797c-4vsdx\" (UID: 
\"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565574 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6vgd\" (UniqueName: \"kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565749 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565663 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.565749 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.565704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.578745 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.578717 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"] Apr 20 20:13:51.673862 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.673775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " 
pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.673862 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.673833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.673868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6vgd\" (UniqueName: \"kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.673945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.673992 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.674023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert\") pod 
\"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.674050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.674701 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.674676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.675427 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.675403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.675427 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.675410 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.675994 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.675970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.677054 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.677031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.677271 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.677132 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.681800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.681776 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6vgd\" (UniqueName: \"kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd\") pod \"console-866497797c-4vsdx\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") " pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.711798 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:51.711772 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-866497797c-4vsdx" Apr 20 20:13:51.917240 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:51.917195 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f710e3e_d3cc_474f_ba2d_1bebaec01dcf.slice/crio-362e26ef95296f6ebde2cce0e668de69256b15a55c999c8f20c8f29fc7bf4c5b WatchSource:0}: Error finding container 362e26ef95296f6ebde2cce0e668de69256b15a55c999c8f20c8f29fc7bf4c5b: Status 404 returned error can't find the container with id 362e26ef95296f6ebde2cce0e668de69256b15a55c999c8f20c8f29fc7bf4c5b Apr 20 20:13:52.024538 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.024501 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5997bd7fc9-n72rb"] Apr 20 20:13:52.028196 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.028169 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.031356 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.031200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 20:13:52.031356 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.031209 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 20:13:52.031356 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.031278 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 20:13:52.031688 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.031622 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 20:13:52.031688 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:13:52.031660 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 20:13:52.031795 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.031704 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dbmfm\"" Apr 20 20:13:52.037509 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.037486 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 20:13:52.040636 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.040590 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5997bd7fc9-n72rb"] Apr 20 20:13:52.179493 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179453 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqr8\" (UniqueName: \"kubernetes.io/projected/505bd79a-a9c9-46d3-aea7-c25765078ece-kube-api-access-vxqr8\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179553 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-federate-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179616 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-metrics-client-ca\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179690 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-serving-certs-ca-bundle\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.179917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.179778 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281258 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281172 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-metrics-client-ca\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281258 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-serving-certs-ca-bundle\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281471 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281471 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:13:52.281309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281471 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281471 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281398 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqr8\" (UniqueName: \"kubernetes.io/projected/505bd79a-a9c9-46d3-aea7-c25765078ece-kube-api-access-vxqr8\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281471 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281430 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-federate-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281715 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281497 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.281979 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.281889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-metrics-client-ca\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.282073 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.282021 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-serving-certs-ca-bundle\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.282513 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.282439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.284562 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.284539 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " 
pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.284847 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.284821 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-telemeter-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.285437 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.285412 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.285526 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.285490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/505bd79a-a9c9-46d3-aea7-c25765078ece-federate-client-tls\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.289600 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.289571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqr8\" (UniqueName: \"kubernetes.io/projected/505bd79a-a9c9-46d3-aea7-c25765078ece-kube-api-access-vxqr8\") pod \"telemeter-client-5997bd7fc9-n72rb\" (UID: \"505bd79a-a9c9-46d3-aea7-c25765078ece\") " pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.341646 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.341612 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" Apr 20 20:13:52.356917 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.356882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" event={"ID":"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf","Type":"ContainerStarted","Data":"362e26ef95296f6ebde2cce0e668de69256b15a55c999c8f20c8f29fc7bf4c5b"} Apr 20 20:13:52.653770 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.653726 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5997bd7fc9-n72rb"] Apr 20 20:13:52.658336 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:52.658030 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505bd79a_a9c9_46d3_aea7_c25765078ece.slice/crio-ad3deb662e4bec22f9e5b57b9132a59febe636486f781a9669e5bba4b7f2779a WatchSource:0}: Error finding container ad3deb662e4bec22f9e5b57b9132a59febe636486f781a9669e5bba4b7f2779a: Status 404 returned error can't find the container with id ad3deb662e4bec22f9e5b57b9132a59febe636486f781a9669e5bba4b7f2779a Apr 20 20:13:52.680361 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:52.678367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-866497797c-4vsdx"] Apr 20 20:13:52.681082 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:13:52.681051 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d5e596_f210_474b_935a_c90aefb9063f.slice/crio-32e0e2aff131cd70b67be43fb686dd6013c77e8db077219e07374046c60198b9 WatchSource:0}: Error finding container 32e0e2aff131cd70b67be43fb686dd6013c77e8db077219e07374046c60198b9: Status 404 returned error can't find the container with id 32e0e2aff131cd70b67be43fb686dd6013c77e8db077219e07374046c60198b9 Apr 20 20:13:53.362229 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:13:53.362172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b4d9" event={"ID":"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47","Type":"ContainerStarted","Data":"2b79647393afbb42c4727f638b5f858fe049a5c69043112284348494614b13dc"} Apr 20 20:13:53.362229 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.362219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b4d9" event={"ID":"7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47","Type":"ContainerStarted","Data":"1869e076ec900755a3e7306e990d75956726ea282d152521b1d3991b5926f593"} Apr 20 20:13:53.362712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.362303 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6b4d9" Apr 20 20:13:53.363448 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.363409 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" event={"ID":"505bd79a-a9c9-46d3-aea7-c25765078ece","Type":"ContainerStarted","Data":"ad3deb662e4bec22f9e5b57b9132a59febe636486f781a9669e5bba4b7f2779a"} Apr 20 20:13:53.364839 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.364816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxb87" event={"ID":"ba7ed180-9b68-40c1-9f30-e9a6a5c96af3","Type":"ContainerStarted","Data":"42100ddfb0f5a782af4dffce214ac71f057bcee669f7bf5df21a5904754d462f"} Apr 20 20:13:53.366059 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.366039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-866497797c-4vsdx" event={"ID":"c7d5e596-f210-474b-935a-c90aefb9063f","Type":"ContainerStarted","Data":"32e0e2aff131cd70b67be43fb686dd6013c77e8db077219e07374046c60198b9"} Apr 20 20:13:53.379744 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.379698 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-6b4d9" podStartSLOduration=129.29673486 podStartE2EDuration="2m11.379686583s" podCreationTimestamp="2026-04-20 20:11:42 +0000 UTC" firstStartedPulling="2026-04-20 20:13:50.429215017 +0000 UTC m=+162.184585528" lastFinishedPulling="2026-04-20 20:13:52.512166737 +0000 UTC m=+164.267537251" observedRunningTime="2026-04-20 20:13:53.379195331 +0000 UTC m=+165.134565864" watchObservedRunningTime="2026-04-20 20:13:53.379686583 +0000 UTC m=+165.135057114" Apr 20 20:13:53.394811 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:53.394767 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cxb87" podStartSLOduration=129.324481949 podStartE2EDuration="2m11.394747078s" podCreationTimestamp="2026-04-20 20:11:42 +0000 UTC" firstStartedPulling="2026-04-20 20:13:50.445798263 +0000 UTC m=+162.201168771" lastFinishedPulling="2026-04-20 20:13:52.516063385 +0000 UTC m=+164.271433900" observedRunningTime="2026-04-20 20:13:53.39408388 +0000 UTC m=+165.149454411" watchObservedRunningTime="2026-04-20 20:13:53.394747078 +0000 UTC m=+165.150117645" Apr 20 20:13:54.372402 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.372361 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959"} Apr 20 20:13:54.372846 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.372410 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae"} Apr 20 20:13:54.372846 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.372424 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3"} Apr 20 20:13:54.372846 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.372435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2"} Apr 20 20:13:54.372846 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.372450 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b"} Apr 20 20:13:54.373813 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.373783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" event={"ID":"8f710e3e-d3cc-474f-ba2d-1bebaec01dcf","Type":"ContainerStarted","Data":"8203f1b5fe150384b323937b58a6348b331aa4fe202a8abab3aee5df6632e8bd"} Apr 20 20:13:54.391362 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:54.391305 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw" podStartSLOduration=1.6681926040000001 podStartE2EDuration="3.391288728s" podCreationTimestamp="2026-04-20 20:13:51 +0000 UTC" firstStartedPulling="2026-04-20 20:13:51.919103112 +0000 UTC m=+163.674473620" lastFinishedPulling="2026-04-20 20:13:53.642199235 +0000 UTC m=+165.397569744" observedRunningTime="2026-04-20 20:13:54.389245236 +0000 UTC m=+166.144615767" watchObservedRunningTime="2026-04-20 20:13:54.391288728 +0000 UTC m=+166.146659259" Apr 20 20:13:57.385075 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.385037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerStarted","Data":"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c"} Apr 20 20:13:57.386823 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.386797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" event={"ID":"505bd79a-a9c9-46d3-aea7-c25765078ece","Type":"ContainerStarted","Data":"9f74da48c07445116f2020eb7db1900cc11c5d112a9a299f7b64b0ef2613a126"} Apr 20 20:13:57.386823 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.386827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" event={"ID":"505bd79a-a9c9-46d3-aea7-c25765078ece","Type":"ContainerStarted","Data":"6e25a707a97a46c33049c455fa67e269570dd909425432165e3d3758c51d819b"} Apr 20 20:13:57.387195 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.386838 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" event={"ID":"505bd79a-a9c9-46d3-aea7-c25765078ece","Type":"ContainerStarted","Data":"806f46cb8fb053464941fb2c51910b1c8454ce61b09a5e0038fcdd624e7b4919"} Apr 20 20:13:57.388200 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.388179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-866497797c-4vsdx" event={"ID":"c7d5e596-f210-474b-935a-c90aefb9063f","Type":"ContainerStarted","Data":"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"} Apr 20 20:13:57.411520 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.411477 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.0523937 podStartE2EDuration="10.411465674s" podCreationTimestamp="2026-04-20 20:13:47 +0000 UTC" firstStartedPulling="2026-04-20 20:13:49.01000004 +0000 UTC m=+160.765370547" 
lastFinishedPulling="2026-04-20 20:13:56.369072 +0000 UTC m=+168.124442521" observedRunningTime="2026-04-20 20:13:57.40973066 +0000 UTC m=+169.165101192" watchObservedRunningTime="2026-04-20 20:13:57.411465674 +0000 UTC m=+169.166836204"
Apr 20 20:13:57.432327 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.432286 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5997bd7fc9-n72rb" podStartSLOduration=1.723792499 podStartE2EDuration="5.432274886s" podCreationTimestamp="2026-04-20 20:13:52 +0000 UTC" firstStartedPulling="2026-04-20 20:13:52.660744758 +0000 UTC m=+164.416115269" lastFinishedPulling="2026-04-20 20:13:56.369227148 +0000 UTC m=+168.124597656" observedRunningTime="2026-04-20 20:13:57.430980614 +0000 UTC m=+169.186351144" watchObservedRunningTime="2026-04-20 20:13:57.432274886 +0000 UTC m=+169.187645415"
Apr 20 20:13:57.446988 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.446930 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-866497797c-4vsdx" podStartSLOduration=2.755079825 podStartE2EDuration="6.446918396s" podCreationTimestamp="2026-04-20 20:13:51 +0000 UTC" firstStartedPulling="2026-04-20 20:13:52.684101247 +0000 UTC m=+164.439471761" lastFinishedPulling="2026-04-20 20:13:56.375939823 +0000 UTC m=+168.131310332" observedRunningTime="2026-04-20 20:13:57.446405724 +0000 UTC m=+169.201776253" watchObservedRunningTime="2026-04-20 20:13:57.446918396 +0000 UTC m=+169.202288925"
Apr 20 20:13:57.620864 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:57.620833 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-866497797c-4vsdx"]
Apr 20 20:13:58.351645 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:58.351614 2580 patch_prober.go:28] interesting pod/image-registry-cd4b6f4f4-kcq97 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:13:58.351801 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:58.351663 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" podUID="cd3d12ef-f755-4800-bd09-5d0005e38f41" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:13:59.075777 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.075718 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" podUID="8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 20:13:59.076470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.076451 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx"
Apr 20 20:13:59.077688 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.077647 2580 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7930f35a958027da3b9c72cf144e170121a46a7558e640a7dfc143549cdf83c0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 20:13:59.077790 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.077711 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" podUID="8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf" containerName="service-proxy" containerID="cri-o://7930f35a958027da3b9c72cf144e170121a46a7558e640a7dfc143549cdf83c0" gracePeriod=30
Apr 20 20:13:59.396159 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.396068 2580 generic.go:358] "Generic (PLEG): container finished" podID="8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf" containerID="7930f35a958027da3b9c72cf144e170121a46a7558e640a7dfc143549cdf83c0" exitCode=2
Apr 20 20:13:59.396159 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.396142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerDied","Data":"7930f35a958027da3b9c72cf144e170121a46a7558e640a7dfc143549cdf83c0"}
Apr 20 20:13:59.396371 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.396182 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5bf9cc7877-b92jx" event={"ID":"8e0ee80e-4f8c-43ce-a9fd-dc8ca98ee0bf","Type":"ContainerStarted","Data":"cefa9bf26e965e5f135cd06ffed0ef4a7e39c0345365b3a2a4df6677af08319c"}
Apr 20 20:13:59.820521 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:13:59.820485 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:14:00.315016 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:00.314982 2580 patch_prober.go:28] interesting pod/image-registry-cd4b6f4f4-kcq97 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:14:00.315349 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:00.315033 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" podUID="cd3d12ef-f755-4800-bd09-5d0005e38f41" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:14:01.712446 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:01.712392 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-866497797c-4vsdx"
Apr 20 20:14:03.376574 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:03.376548 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6b4d9"
Apr 20 20:14:08.350746 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:08.350712 2580 patch_prober.go:28] interesting pod/image-registry-cd4b6f4f4-kcq97 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:14:08.351112 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:08.350780 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97" podUID="cd3d12ef-f755-4800-bd09-5d0005e38f41" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:14:10.315364 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:10.315338 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-cd4b6f4f4-kcq97"
Apr 20 20:14:11.440661 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:11.440632 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"
Apr 20 20:14:11.440661 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:11.440663 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"
Apr 20 20:14:24.415773 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.415710 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-866497797c-4vsdx" podUID="c7d5e596-f210-474b-935a-c90aefb9063f" containerName="console" containerID="cri-o://3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987" gracePeriod=15
Apr 20 20:14:24.648308 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.648288 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-866497797c-4vsdx_c7d5e596-f210-474b-935a-c90aefb9063f/console/0.log"
Apr 20 20:14:24.648410 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.648353 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-866497797c-4vsdx"
Apr 20 20:14:24.766287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766261 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766296 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766316 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766348 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766377 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6vgd\" (UniqueName: \"kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766399 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766690 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766470 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert\") pod \"c7d5e596-f210-474b-935a-c90aefb9063f\" (UID: \"c7d5e596-f210-474b-935a-c90aefb9063f\") "
Apr 20 20:14:24.766746 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766681 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:24.766838 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766799 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config" (OuterVolumeSpecName: "console-config") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:24.766934 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766911 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca" (OuterVolumeSpecName: "service-ca") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:24.767014 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.766984 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:24.768714 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.768688 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd" (OuterVolumeSpecName: "kube-api-access-d6vgd") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "kube-api-access-d6vgd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:14:24.769154 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.769133 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:24.769228 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.769150 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c7d5e596-f210-474b-935a-c90aefb9063f" (UID: "c7d5e596-f210-474b-935a-c90aefb9063f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:24.867204 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867175 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6vgd\" (UniqueName: \"kubernetes.io/projected/c7d5e596-f210-474b-935a-c90aefb9063f-kube-api-access-d6vgd\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867204 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867201 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-serving-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867212 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-oauth-serving-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867221 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7d5e596-f210-474b-935a-c90aefb9063f-console-oauth-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867230 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-trusted-ca-bundle\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867239 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-console-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:24.867363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:24.867248 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7d5e596-f210-474b-935a-c90aefb9063f-service-ca\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\""
Apr 20 20:14:25.465801 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465776 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-866497797c-4vsdx_c7d5e596-f210-474b-935a-c90aefb9063f/console/0.log"
Apr 20 20:14:25.466285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465815 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7d5e596-f210-474b-935a-c90aefb9063f" containerID="3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987" exitCode=2
Apr 20 20:14:25.466285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-866497797c-4vsdx" event={"ID":"c7d5e596-f210-474b-935a-c90aefb9063f","Type":"ContainerDied","Data":"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"}
Apr 20 20:14:25.466285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465876 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-866497797c-4vsdx" event={"ID":"c7d5e596-f210-474b-935a-c90aefb9063f","Type":"ContainerDied","Data":"32e0e2aff131cd70b67be43fb686dd6013c77e8db077219e07374046c60198b9"}
Apr 20 20:14:25.466285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465885 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-866497797c-4vsdx"
Apr 20 20:14:25.466285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.465890 2580 scope.go:117] "RemoveContainer" containerID="3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"
Apr 20 20:14:25.473634 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.473616 2580 scope.go:117] "RemoveContainer" containerID="3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"
Apr 20 20:14:25.473863 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:14:25.473844 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987\": container with ID starting with 3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987 not found: ID does not exist" containerID="3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"
Apr 20 20:14:25.473922 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.473874 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987"} err="failed to get container status \"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987\": rpc error: code = NotFound desc = could not find container \"3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987\": container with ID starting with 3250f72021d330f02d9c851b10c1065e5fb35ba2764c740343fa35230b892987 not found: ID does not exist"
Apr 20 20:14:25.483877 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.483851 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-866497797c-4vsdx"]
Apr 20 20:14:25.485753 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:25.485735 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-866497797c-4vsdx"]
Apr 20 20:14:26.823733 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:26.823697 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d5e596-f210-474b-935a-c90aefb9063f" path="/var/lib/kubelet/pods/c7d5e596-f210-474b-935a-c90aefb9063f/volumes"
Apr 20 20:14:31.445133 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:31.445104 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"
Apr 20 20:14:31.448891 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:14:31.448864 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fb54d5cbb-nm9lw"
Apr 20 20:15:03.446069 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.446029 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79b5459745-gr874"]
Apr 20 20:15:03.446505 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.446287 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d5e596-f210-474b-935a-c90aefb9063f" containerName="console"
Apr 20 20:15:03.446505 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.446297 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d5e596-f210-474b-935a-c90aefb9063f" containerName="console"
Apr 20 20:15:03.446505 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.446348 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d5e596-f210-474b-935a-c90aefb9063f" containerName="console"
Apr 20 20:15:03.450256 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.450237 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.452943 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.452918 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 20:15:03.452943 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.452933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m57vs\""
Apr 20 20:15:03.453142 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.452926 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 20:15:03.453142 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.453008 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 20:15:03.453250 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.453158 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 20:15:03.453305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.453258 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 20:15:03.453305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.453275 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 20:15:03.454243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.454225 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 20:15:03.458685 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.458664 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 20:15:03.460150 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.460131 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b5459745-gr874"]
Apr 20 20:15:03.581730 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.581894 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknmf\" (UniqueName: \"kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.581894 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.581894 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581879 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.582067 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.582067 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581925 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.582067 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.581969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683209 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683278 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683327 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.683627 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683474 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kknmf\" (UniqueName: \"kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.684001 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.683982 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.684269 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.684249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.684337 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.684314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.684377 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.684314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.685898 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.685870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.686011 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.685928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.692617 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.692599 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknmf\" (UniqueName: \"kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf\") pod \"console-79b5459745-gr874\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.760295 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.760273 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:03.878204 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:03.878175 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b5459745-gr874"]
Apr 20 20:15:03.880614 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:15:03.880589 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd5486b_d629_4252_8574_eae4ce3d45d7.slice/crio-25a467149d982b6561a087751a2bc5477b656efb7104d9cc99f00a68e6d4a552 WatchSource:0}: Error finding container 25a467149d982b6561a087751a2bc5477b656efb7104d9cc99f00a68e6d4a552: Status 404 returned error can't find the container with id 25a467149d982b6561a087751a2bc5477b656efb7104d9cc99f00a68e6d4a552
Apr 20 20:15:04.569779 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:04.569744 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b5459745-gr874" event={"ID":"0bd5486b-d629-4252-8574-eae4ce3d45d7","Type":"ContainerStarted","Data":"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d"}
Apr 20 20:15:04.569779 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:04.569779 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b5459745-gr874" event={"ID":"0bd5486b-d629-4252-8574-eae4ce3d45d7","Type":"ContainerStarted","Data":"25a467149d982b6561a087751a2bc5477b656efb7104d9cc99f00a68e6d4a552"}
Apr 20 20:15:04.586928 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:04.586887 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79b5459745-gr874" podStartSLOduration=1.5868742550000001 podStartE2EDuration="1.586874255s" podCreationTimestamp="2026-04-20 20:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:04.586152189 +0000 UTC m=+236.341522731" watchObservedRunningTime="2026-04-20 20:15:04.586874255 +0000 UTC m=+236.342244785"
Apr 20 20:15:07.053270 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053235 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:15:07.053742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053719 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="alertmanager" containerID="cri-o://e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" gracePeriod=120
Apr 20 20:15:07.053989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053781 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-metric" containerID="cri-o://02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" gracePeriod=120
Apr 20 20:15:07.053989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053812 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="prom-label-proxy" containerID="cri-o://7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" gracePeriod=120
Apr 20 20:15:07.053989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053867 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="config-reloader" containerID="cri-o://e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" gracePeriod=120
Apr 20 20:15:07.053989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053832 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy" containerID="cri-o://ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" gracePeriod=120
Apr 20 20:15:07.053989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.053830 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-web" containerID="cri-o://8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" gracePeriod=120
Apr 20 20:15:07.585056 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585026 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" exitCode=0
Apr 20 20:15:07.585056 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585052 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" exitCode=0
Apr 20 20:15:07.585056 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585059 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" exitCode=0
Apr 20 20:15:07.585283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585066 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" exitCode=0
Apr 20 20:15:07.585283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585098 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c"}
Apr 20 20:15:07.585283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585130 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae"}
Apr 20 20:15:07.585283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585140 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2"}
Apr 20 20:15:07.585283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:07.585148 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b"}
Apr 20 20:15:08.291887 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.291866 2580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.420858 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420780 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.420858 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420829 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420880 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9vpj\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420908 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420933 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 
20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.420987 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421023 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421047 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421090 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421124 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421422 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:15:08.421167 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421189 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421230 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") pod \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\" (UID: \"c156e30d-3f28-45a6-b7eb-2e01a40bda41\") " Apr 20 20:15:08.421422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421326 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:08.421651 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421555 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-metrics-client-ca\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.422099 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.421743 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:08.423267 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.423231 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:15:08.424175 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424137 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume" (OuterVolumeSpecName: "config-volume") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.424464 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424424 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out" (OuterVolumeSpecName: "config-out") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:15:08.424464 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424445 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.424614 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424485 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.424614 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424586 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj" (OuterVolumeSpecName: "kube-api-access-n9vpj") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "kube-api-access-n9vpj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:08.424865 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.424839 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.425153 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.425129 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.426061 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.426039 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:08.429176 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.429150 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.434213 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.434193 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config" (OuterVolumeSpecName: "web-config") pod "c156e30d-3f28-45a6-b7eb-2e01a40bda41" (UID: "c156e30d-3f28-45a6-b7eb-2e01a40bda41"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:08.522709 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522685 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-out\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522709 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522709 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522720 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9vpj\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-kube-api-access-n9vpj\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522730 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-web-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522738 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522747 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522756 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c156e30d-3f28-45a6-b7eb-2e01a40bda41-alertmanager-main-db\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522764 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-config-volume\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522772 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c156e30d-3f28-45a6-b7eb-2e01a40bda41-tls-assets\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522781 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522790 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-cluster-tls-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.522835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.522799 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c156e30d-3f28-45a6-b7eb-2e01a40bda41-secret-alertmanager-main-tls\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:15:08.590560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590533 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" exitCode=0 Apr 20 20:15:08.590560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590554 2580 generic.go:358] "Generic (PLEG): container finished" podID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerID="8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" exitCode=0 Apr 20 20:15:08.590727 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590608 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959"} Apr 20 20:15:08.590727 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590643 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3"} Apr 20 20:15:08.590727 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590654 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c156e30d-3f28-45a6-b7eb-2e01a40bda41","Type":"ContainerDied","Data":"9c359448af4dfef69a1df89aec19b9f9c3ade3db02c4ad4afd5bbd0b41eed6eb"} Apr 20 20:15:08.590727 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590669 2580 scope.go:117] "RemoveContainer" containerID="7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" Apr 20 20:15:08.590727 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.590669 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.598213 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.598197 2580 scope.go:117] "RemoveContainer" containerID="02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" Apr 20 20:15:08.604556 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.604539 2580 scope.go:117] "RemoveContainer" containerID="ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" Apr 20 20:15:08.612500 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.612478 2580 scope.go:117] "RemoveContainer" containerID="8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" Apr 20 20:15:08.613851 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.613834 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:15:08.617848 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.617827 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:15:08.619529 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.619515 2580 scope.go:117] "RemoveContainer" containerID="e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" Apr 20 20:15:08.625534 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.625517 2580 scope.go:117] "RemoveContainer" containerID="e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" Apr 20 20:15:08.631886 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.631864 
2580 scope.go:117] "RemoveContainer" containerID="20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27" Apr 20 20:15:08.637943 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.637915 2580 scope.go:117] "RemoveContainer" containerID="7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" Apr 20 20:15:08.638247 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.638210 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c\": container with ID starting with 7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c not found: ID does not exist" containerID="7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" Apr 20 20:15:08.638328 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638247 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c"} err="failed to get container status \"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c\": rpc error: code = NotFound desc = could not find container \"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c\": container with ID starting with 7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c not found: ID does not exist" Apr 20 20:15:08.638328 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638274 2580 scope.go:117] "RemoveContainer" containerID="02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" Apr 20 20:15:08.638594 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.638575 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959\": container with ID starting with 02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959 not found: ID 
does not exist" containerID="02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" Apr 20 20:15:08.638661 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638601 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959"} err="failed to get container status \"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959\": rpc error: code = NotFound desc = could not find container \"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959\": container with ID starting with 02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959 not found: ID does not exist" Apr 20 20:15:08.638661 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638619 2580 scope.go:117] "RemoveContainer" containerID="ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" Apr 20 20:15:08.638906 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.638886 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae\": container with ID starting with ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae not found: ID does not exist" containerID="ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" Apr 20 20:15:08.638981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638913 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae"} err="failed to get container status \"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae\": rpc error: code = NotFound desc = could not find container \"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae\": container with ID starting with ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae not found: ID does not 
exist" Apr 20 20:15:08.638981 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.638930 2580 scope.go:117] "RemoveContainer" containerID="8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" Apr 20 20:15:08.639196 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.639180 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3\": container with ID starting with 8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3 not found: ID does not exist" containerID="8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" Apr 20 20:15:08.639238 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639199 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3"} err="failed to get container status \"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3\": rpc error: code = NotFound desc = could not find container \"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3\": container with ID starting with 8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3 not found: ID does not exist" Apr 20 20:15:08.639238 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639222 2580 scope.go:117] "RemoveContainer" containerID="e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" Apr 20 20:15:08.639472 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.639451 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2\": container with ID starting with e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2 not found: ID does not exist" containerID="e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" Apr 20 
20:15:08.639522 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639481 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2"} err="failed to get container status \"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2\": rpc error: code = NotFound desc = could not find container \"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2\": container with ID starting with e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2 not found: ID does not exist" Apr 20 20:15:08.639522 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639504 2580 scope.go:117] "RemoveContainer" containerID="e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" Apr 20 20:15:08.639736 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.639715 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b\": container with ID starting with e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b not found: ID does not exist" containerID="e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" Apr 20 20:15:08.639823 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639739 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b"} err="failed to get container status \"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b\": rpc error: code = NotFound desc = could not find container \"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b\": container with ID starting with e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b not found: ID does not exist" Apr 20 20:15:08.639823 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639754 2580 scope.go:117] 
"RemoveContainer" containerID="20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27" Apr 20 20:15:08.639823 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.639815 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:15:08.640017 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:15:08.639988 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27\": container with ID starting with 20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27 not found: ID does not exist" containerID="20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27" Apr 20 20:15:08.640017 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640003 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27"} err="failed to get container status \"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27\": rpc error: code = NotFound desc = could not find container \"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27\": container with ID starting with 20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27 not found: ID does not exist" Apr 20 20:15:08.640017 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640015 2580 scope.go:117] "RemoveContainer" containerID="7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c" Apr 20 20:15:08.640219 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640202 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="config-reloader" Apr 20 20:15:08.640280 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640222 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" 
containerName="config-reloader" Apr 20 20:15:08.640280 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640223 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c"} err="failed to get container status \"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c\": rpc error: code = NotFound desc = could not find container \"7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c\": container with ID starting with 7ef88b4ae3c3783ece2536d67213409f54d7991749720fcecc4bcec3a776845c not found: ID does not exist" Apr 20 20:15:08.640280 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640236 2580 scope.go:117] "RemoveContainer" containerID="02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959" Apr 20 20:15:08.640280 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640269 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="alertmanager" Apr 20 20:15:08.640280 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640279 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="alertmanager" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640300 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-metric" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640309 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-metric" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640327 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy" Apr 20 
20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640336 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640348 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="init-config-reloader" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640357 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="init-config-reloader" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640371 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="prom-label-proxy" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640380 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="prom-label-proxy" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640389 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-web" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640398 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-web" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640445 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959"} err="failed to get container status \"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959\": rpc error: code = NotFound desc = could not find container 
\"02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959\": container with ID starting with 02e1a72c9f3e19df639d59b62601661f70b7abeac7c5497feefe348a8239b959 not found: ID does not exist" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640463 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="alertmanager" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640477 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640487 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-metric" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640497 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="kube-rbac-proxy-web" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640506 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="prom-label-proxy" Apr 20 20:15:08.640515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640517 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" containerName="config-reloader" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640466 2580 scope.go:117] "RemoveContainer" containerID="ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640737 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae"} err="failed 
to get container status \"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae\": rpc error: code = NotFound desc = could not find container \"ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae\": container with ID starting with ebcfce2bf88771452674ce6705278b8d19ed90d300674c50aa8a87d5657ba3ae not found: ID does not exist" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.640757 2580 scope.go:117] "RemoveContainer" containerID="8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641016 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3"} err="failed to get container status \"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3\": rpc error: code = NotFound desc = could not find container \"8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3\": container with ID starting with 8156916ed92ff61c9c971c45657c9eb9a6c343342f984bf6461cdb78688191e3 not found: ID does not exist" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641037 2580 scope.go:117] "RemoveContainer" containerID="e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2" Apr 20 20:15:08.641296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641265 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2"} err="failed to get container status \"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2\": rpc error: code = NotFound desc = could not find container \"e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2\": container with ID starting with e8c23c84b78dd4206018efaf483a8ded2447ac3af3265b45abb373dc656663b2 not found: ID does not exist" Apr 20 20:15:08.641296 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641280 2580 scope.go:117] "RemoveContainer" containerID="e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b" Apr 20 20:15:08.641604 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641476 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b"} err="failed to get container status \"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b\": rpc error: code = NotFound desc = could not find container \"e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b\": container with ID starting with e3b5c1192bd818876eeb10af3768d0a2c6d68ee5cb2d3ed83f5d3aa11f35a84b not found: ID does not exist" Apr 20 20:15:08.641604 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641490 2580 scope.go:117] "RemoveContainer" containerID="20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27" Apr 20 20:15:08.641704 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.641669 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27"} err="failed to get container status \"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27\": rpc error: code = NotFound desc = could not find container \"20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27\": container with ID starting with 20a6c8e7d71a10b325177d05a48a5a4e0d3c339ecb8ff2dbedd22ee1dff4ac27 not found: ID does not exist" Apr 20 20:15:08.645665 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.645645 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.648452 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.648432 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:15:08.648558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.648441 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:15:08.648558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.648507 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:15:08.648558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.648552 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:15:08.648710 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.648507 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:15:08.649032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.649008 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:15:08.649032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.649018 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:15:08.649157 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.649018 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:15:08.649157 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.649075 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nwd95\"" Apr 20 20:15:08.653713 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.653696 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:15:08.658855 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.658829 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:15:08.724891 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.724867 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725004 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.724902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725004 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.724920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725004 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.724960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-web-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725004 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.724984 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725178 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725023 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrb48\" (UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-kube-api-access-hrb48\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725178 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725178 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725088 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-config-out\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725178 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:15:08.725112 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725321 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725174 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725321 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-config-volume\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725321 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725246 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.725321 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.725264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.824285 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.824264 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c156e30d-3f28-45a6-b7eb-2e01a40bda41" path="/var/lib/kubelet/pods/c156e30d-3f28-45a6-b7eb-2e01a40bda41/volumes" Apr 20 20:15:08.826223 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826206 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826277 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826260 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826375 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826457 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-web-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826505 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826462 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826505 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrb48\" (UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-kube-api-access-hrb48\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826602 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826523 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826602 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826552 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-config-out\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826602 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826578 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826720 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826610 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.826720 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.826660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-config-volume\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.827005 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:15:08.826988 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.827152 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.827135 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.828688 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.828669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d57f93f0-bc8e-4a80-8582-4faac193738d-config-out\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.829161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829140 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:15:08.829161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829155 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:15:08.829276 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:15:08.829276 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:15:08.829440 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829427 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:15:08.829498 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829432 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:15:08.829595 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829578 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:15:08.829647 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.829598 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:15:08.833197 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.833182 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:15:08.836051 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.836037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrb48\" (UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-kube-api-access-hrb48\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.837987 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.837942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57f93f0-bc8e-4a80-8582-4faac193738d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.839701 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.839701 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d57f93f0-bc8e-4a80-8582-4faac193738d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.839816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839709 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.839816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-web-config\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:15:08.839816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-config-volume\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:15:08.839899 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:15:08.839932 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.839894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:15:08.840368 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.840347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d57f93f0-bc8e-4a80-8582-4faac193738d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d57f93f0-bc8e-4a80-8582-4faac193738d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:15:08.957418 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.957369 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nwd95\""
Apr 20 20:15:08.965526 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:08.965509 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:15:09.088063 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:09.087870 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:15:09.593860 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:09.593819 2580 generic.go:358] "Generic (PLEG): container finished" podID="d57f93f0-bc8e-4a80-8582-4faac193738d" containerID="6a8de1ffb05e4eee40a95011de7f7c95fdc9bb7fe106bf7a6040649af312f545" exitCode=0
Apr 20 20:15:09.594237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:09.593882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerDied","Data":"6a8de1ffb05e4eee40a95011de7f7c95fdc9bb7fe106bf7a6040649af312f545"}
Apr 20 20:15:09.594237 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:09.593916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"64537e53b49613bd7004ed1d77563d91dc9c3e1159436dcd640f367f6b5167bc"}
Apr 20 20:15:10.599200 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599169 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"b7f765f1d377cfc4dfcaed7125e3786632aac7ef62b03c8c82f3db169717af41"}
Apr 20 20:15:10.599200 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"65c94df28c1ae5d26209b3f57585b82a9ed4cf8cef648e7caab8120f9fc57407"}
Apr 20 20:15:10.599572 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599217 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"60c66310ed1b938591e90780418c019ee2d6e36bd3cf0669bdd53761e2ab3b50"}
Apr 20 20:15:10.599572 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"5efd7d586bfb83a22436c99ed26a000f316d14c211d9e0823961257901246351"}
Apr 20 20:15:10.599572 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"9107b19c0f296ae28c7b3f794d5a2aa2f54f23704452c5896e85686484ab8d8b"}
Apr 20 20:15:10.599572 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.599242 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d57f93f0-bc8e-4a80-8582-4faac193738d","Type":"ContainerStarted","Data":"fe266246c47c794f6f9976339f94013b0d2daebb133a763e59778eee9b7d69f8"}
Apr 20 20:15:10.625209 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:10.625164 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.62515007 podStartE2EDuration="2.62515007s" podCreationTimestamp="2026-04-20 20:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:10.6240732 +0000 UTC m=+242.379443731" watchObservedRunningTime="2026-04-20 20:15:10.62515007 +0000 UTC m=+242.380520600"
Apr 20 20:15:13.761283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:13.761246 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:13.761283 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:13.761290 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:13.765868 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:13.765847 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:14.613794 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:14.613762 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79b5459745-gr874"
Apr 20 20:15:19.510756 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:19.510719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:15:19.513334 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:19.513307 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9-metrics-certs\") pod \"network-metrics-daemon-z9tzr\" (UID: \"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9\") " pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:15:19.624107 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:19.624079 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\""
Apr 20 20:15:19.631422 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:19.631402 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9tzr"
Apr 20 20:15:19.746645 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:19.746616 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9tzr"]
Apr 20 20:15:19.748307 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:15:19.748284 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac49f1d0_6c1e_4394_8c2b_7f5c9cac6ed9.slice/crio-c6763cc7cb0996383c709a5f2692dbab7ce9d2d999d7ac6fb94e76602a4162ce WatchSource:0}: Error finding container c6763cc7cb0996383c709a5f2692dbab7ce9d2d999d7ac6fb94e76602a4162ce: Status 404 returned error can't find the container with id c6763cc7cb0996383c709a5f2692dbab7ce9d2d999d7ac6fb94e76602a4162ce
Apr 20 20:15:20.626678 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:20.626631 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9tzr" event={"ID":"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9","Type":"ContainerStarted","Data":"c6763cc7cb0996383c709a5f2692dbab7ce9d2d999d7ac6fb94e76602a4162ce"}
Apr 20 20:15:21.631095 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:21.631056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9tzr" event={"ID":"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9","Type":"ContainerStarted","Data":"df283ff161d147345c256258e8d737d7e7beb0d5b662878db66af86c80bf096e"}
Apr 20 20:15:21.631095 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:21.631097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9tzr" event={"ID":"ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9","Type":"ContainerStarted","Data":"dc12674a6628f66d7ff278f6bb8c11c6e17930ccdeaa6c460f36a21329414fad"}
Apr 20 20:15:21.646810 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:15:21.646761 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z9tzr" podStartSLOduration=252.607080717 podStartE2EDuration="4m13.646743289s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:15:19.750474738 +0000 UTC m=+251.505845248" lastFinishedPulling="2026-04-20 20:15:20.790137299 +0000 UTC m=+252.545507820" observedRunningTime="2026-04-20 20:15:21.645781103 +0000 UTC m=+253.401151638" watchObservedRunningTime="2026-04-20 20:15:21.646743289 +0000 UTC m=+253.402113820"
Apr 20 20:16:08.709803 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:16:08.709774 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:16:08.710525 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:16:08.710501 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log"
Apr 20 20:16:08.714072 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:16:08.714053 2580 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 20:17:11.011397 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.011360 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"]
Apr 20 20:17:11.014601 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.014582 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.018302 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.018268 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 20:17:11.018469 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.018318 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 20:17:11.018469 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.018370 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pq7n9\""
Apr 20 20:17:11.018469 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.018380 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 20:17:11.019454 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.019440 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 20:17:11.031330 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.031299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"]
Apr 20 20:17:11.042135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.042089 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.042135 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.042130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdr5s\" (UniqueName: \"kubernetes.io/projected/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-kube-api-access-pdr5s\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.042407 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.042228 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.142931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.142891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.142931 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.142932 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdr5s\" (UniqueName: \"kubernetes.io/projected/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-kube-api-access-pdr5s\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.143188 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.143012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.145720 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.145683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.145720 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.145716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.152965 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.152913 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdr5s\" (UniqueName: \"kubernetes.io/projected/4ad36871-29bc-44b2-b2f7-bc5a2763a8fc-kube-api-access-pdr5s\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-9tp86\" (UID: \"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.326171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.326072 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:11.477117 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.477051 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"]
Apr 20 20:17:11.479665 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:17:11.479636 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad36871_29bc_44b2_b2f7_bc5a2763a8fc.slice/crio-9ab356112240dec9463bf22e8df4e94a6e4da92f6194b47ced03d889171df3f4 WatchSource:0}: Error finding container 9ab356112240dec9463bf22e8df4e94a6e4da92f6194b47ced03d889171df3f4: Status 404 returned error can't find the container with id 9ab356112240dec9463bf22e8df4e94a6e4da92f6194b47ced03d889171df3f4
Apr 20 20:17:11.481558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.481535 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:17:11.932032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:11.931998 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86" event={"ID":"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc","Type":"ContainerStarted","Data":"9ab356112240dec9463bf22e8df4e94a6e4da92f6194b47ced03d889171df3f4"}
Apr 20 20:17:14.943293 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:14.943259 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86" event={"ID":"4ad36871-29bc-44b2-b2f7-bc5a2763a8fc","Type":"ContainerStarted","Data":"cc2b7bc4469b9bb323de70449d02a556385d467715ea01eef7b2c67c1aac85b0"}
Apr 20 20:17:14.943693 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:14.943400 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:14.971110 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:14.971052 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86" podStartSLOduration=2.410123116 podStartE2EDuration="4.971029955s" podCreationTimestamp="2026-04-20 20:17:10 +0000 UTC" firstStartedPulling="2026-04-20 20:17:11.481708069 +0000 UTC m=+363.237078577" lastFinishedPulling="2026-04-20 20:17:14.042614908 +0000 UTC m=+365.797985416" observedRunningTime="2026-04-20 20:17:14.969616855 +0000 UTC m=+366.724987412" watchObservedRunningTime="2026-04-20 20:17:14.971029955 +0000 UTC m=+366.726400488"
Apr 20 20:17:16.849179 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.849128 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"]
Apr 20 20:17:16.852222 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.852197 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.855113 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.855083 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 20:17:16.856460 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.856437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 20:17:16.856460 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.856454 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z5t9d\""
Apr 20 20:17:16.856662 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.856437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 20:17:16.856662 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.856470 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:17:16.856662 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.856437 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 20:17:16.861015 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.860989 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"]
Apr 20 20:17:16.893245 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.893204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-metrics-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.893463 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.893254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99jvx\" (UniqueName: \"kubernetes.io/projected/d86e9816-2979-4c19-ac99-08d1b5eab437-kube-api-access-99jvx\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.893463 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.893336 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.893463 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.893377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d86e9816-2979-4c19-ac99-08d1b5eab437-manager-config\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.994808 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.994753 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-metrics-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.995032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.994817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99jvx\" (UniqueName: \"kubernetes.io/projected/d86e9816-2979-4c19-ac99-08d1b5eab437-kube-api-access-99jvx\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.995032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.994852 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.995032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.994878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d86e9816-2979-4c19-ac99-08d1b5eab437-manager-config\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.995591 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.995565 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d86e9816-2979-4c19-ac99-08d1b5eab437-manager-config\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.997557 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.997528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:16.997668 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:16.997618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e9816-2979-4c19-ac99-08d1b5eab437-metrics-cert\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:17.009995 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:17.009943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99jvx\" (UniqueName: \"kubernetes.io/projected/d86e9816-2979-4c19-ac99-08d1b5eab437-kube-api-access-99jvx\") pod \"lws-controller-manager-54d459c768-tk8nw\" (UID: \"d86e9816-2979-4c19-ac99-08d1b5eab437\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:17.163451 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:17.163351 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:17.290474 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:17.290448 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"]
Apr 20 20:17:17.293400 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:17:17.293369 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd86e9816_2979_4c19_ac99_08d1b5eab437.slice/crio-3eb29eb78dea66290f13a4c0ca7431475b96ba633303a1d2f6d5e1c1c6a4acf6 WatchSource:0}: Error finding container 3eb29eb78dea66290f13a4c0ca7431475b96ba633303a1d2f6d5e1c1c6a4acf6: Status 404 returned error can't find the container with id 3eb29eb78dea66290f13a4c0ca7431475b96ba633303a1d2f6d5e1c1c6a4acf6
Apr 20 20:17:17.953295 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:17.953258 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw" event={"ID":"d86e9816-2979-4c19-ac99-08d1b5eab437","Type":"ContainerStarted","Data":"3eb29eb78dea66290f13a4c0ca7431475b96ba633303a1d2f6d5e1c1c6a4acf6"}
Apr 20 20:17:20.969521 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:20.969485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw" event={"ID":"d86e9816-2979-4c19-ac99-08d1b5eab437","Type":"ContainerStarted","Data":"6ddd09d544709776887a149e040fcee91eaa503f511e490810bce897a9e40796"}
Apr 20 20:17:20.970033 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:20.969600 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:20.987134 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:20.987070 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw" podStartSLOduration=1.446390554 podStartE2EDuration="4.987051328s" podCreationTimestamp="2026-04-20 20:17:16 +0000 UTC" firstStartedPulling="2026-04-20 20:17:17.295341986 +0000 UTC m=+369.050712493" lastFinishedPulling="2026-04-20 20:17:20.83600276 +0000 UTC m=+372.591373267" observedRunningTime="2026-04-20 20:17:20.984730281 +0000 UTC m=+372.740100810" watchObservedRunningTime="2026-04-20 20:17:20.987051328 +0000 UTC m=+372.742421925"
Apr 20 20:17:25.948428 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:25.948398 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-9tp86"
Apr 20 20:17:28.487650 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.487612 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"]
Apr 20 20:17:28.491129 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.491105 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.493749 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.493724 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 20:17:28.495162 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.495137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-dxx7q\""
Apr 20 20:17:28.495306 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.495180 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 20:17:28.495306 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.495207 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 20:17:28.495306 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.495143 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 20:17:28.500475 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.500443 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"]
Apr 20 20:17:28.595119 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.595077 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnxs\" (UniqueName: \"kubernetes.io/projected/38593f2b-438c-4aaf-951c-d88294b014b8-kube-api-access-2lnxs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.595119 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.595126 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38593f2b-438c-4aaf-951c-d88294b014b8-tmp\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.595344 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.595151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38593f2b-438c-4aaf-951c-d88294b014b8-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.696113 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.696066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lnxs\" (UniqueName: \"kubernetes.io/projected/38593f2b-438c-4aaf-951c-d88294b014b8-kube-api-access-2lnxs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.696244 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.696131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38593f2b-438c-4aaf-951c-d88294b014b8-tmp\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.696244 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.696173 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38593f2b-438c-4aaf-951c-d88294b014b8-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.698654 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.698619 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38593f2b-438c-4aaf-951c-d88294b014b8-tmp\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.698891 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.698865 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38593f2b-438c-4aaf-951c-d88294b014b8-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.705482 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.705451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lnxs\" (UniqueName: \"kubernetes.io/projected/38593f2b-438c-4aaf-951c-d88294b014b8-kube-api-access-2lnxs\") pod \"kube-auth-proxy-66df9c9b9f-9h5fb\" (UID: \"38593f2b-438c-4aaf-951c-d88294b014b8\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.803171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.803058 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"
Apr 20 20:17:28.935701 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.935532 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb"]
Apr 20 20:17:28.937999 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:17:28.937935 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38593f2b_438c_4aaf_951c_d88294b014b8.slice/crio-9b95be4490f5b8a1f99d2e60533b168f1facccd0d9ad4402bdcb4080f3fd87e8 WatchSource:0}: Error finding container 9b95be4490f5b8a1f99d2e60533b168f1facccd0d9ad4402bdcb4080f3fd87e8: Status 404 returned error can't find the container with id 9b95be4490f5b8a1f99d2e60533b168f1facccd0d9ad4402bdcb4080f3fd87e8
Apr 20 20:17:28.994449 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:28.994402 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb" event={"ID":"38593f2b-438c-4aaf-951c-d88294b014b8","Type":"ContainerStarted","Data":"9b95be4490f5b8a1f99d2e60533b168f1facccd0d9ad4402bdcb4080f3fd87e8"}
Apr 20 20:17:31.975305 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:31.975272 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54d459c768-tk8nw"
Apr 20 20:17:33.009289 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:33.009253 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb" event={"ID":"38593f2b-438c-4aaf-951c-d88294b014b8","Type":"ContainerStarted","Data":"80b86f5e1a9a866a7a2f68685874605cb2ea271bb76aede1a0baf6b559d9c0fa"}
Apr 20 20:17:33.027168 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:17:33.027114 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-9h5fb" podStartSLOduration=1.262337229 podStartE2EDuration="5.027098069s" podCreationTimestamp="2026-04-20 20:17:28 +0000 UTC" firstStartedPulling="2026-04-20 20:17:28.939679688 +0000 UTC m=+380.695050200" lastFinishedPulling="2026-04-20 20:17:32.70444053 +0000 UTC m=+384.459811040" observedRunningTime="2026-04-20 20:17:33.026522589 +0000 UTC m=+384.781893119" watchObservedRunningTime="2026-04-20 20:17:33.027098069 +0000 UTC m=+384.782468599"
Apr 20 20:19:12.565533 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:12.565499 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79b5459745-gr874"]
Apr 20 20:19:14.816234 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.816198 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"]
Apr 20 20:19:14.819374 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.819355 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"
Apr 20 20:19:14.822002 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.821973 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 20 20:19:14.822165 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.822023 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 20:19:14.823249 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.823232 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hfnq\""
Apr 20 20:19:14.823379 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.823287 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 20:19:14.823379 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.823237 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 20 20:19:14.826467 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.826448 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"]
Apr 20 20:19:14.872887 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.872864 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a23f4d3b-cada-49ca-8a22-7404ff74a485-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"
Apr 20 20:19:14.873007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.872897 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8nd\" (UniqueName: \"kubernetes.io/projected/a23f4d3b-cada-49ca-8a22-7404ff74a485-kube-api-access-lr8nd\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"
Apr 20 20:19:14.873007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.872924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23f4d3b-cada-49ca-8a22-7404ff74a485-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"
Apr 20 20:19:14.973667 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.973641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a23f4d3b-cada-49ca-8a22-7404ff74a485-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID:
\"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:14.973785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.973676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8nd\" (UniqueName: \"kubernetes.io/projected/a23f4d3b-cada-49ca-8a22-7404ff74a485-kube-api-access-lr8nd\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:14.973785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.973714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23f4d3b-cada-49ca-8a22-7404ff74a485-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:14.974409 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.974344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a23f4d3b-cada-49ca-8a22-7404ff74a485-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:14.976317 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.976291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23f4d3b-cada-49ca-8a22-7404ff74a485-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:14.981853 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:14.981830 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lr8nd\" (UniqueName: \"kubernetes.io/projected/a23f4d3b-cada-49ca-8a22-7404ff74a485-kube-api-access-lr8nd\") pod \"kuadrant-console-plugin-6cb54b5c86-9mzrg\" (UID: \"a23f4d3b-cada-49ca-8a22-7404ff74a485\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:15.129060 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:15.128977 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" Apr 20 20:19:15.241426 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:15.241402 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg"] Apr 20 20:19:15.243677 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:19:15.243647 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda23f4d3b_cada_49ca_8a22_7404ff74a485.slice/crio-e522b41fabdd6d0cae613b3cd708dd3bffbd4d25dd82406532fbb428ebeb9a98 WatchSource:0}: Error finding container e522b41fabdd6d0cae613b3cd708dd3bffbd4d25dd82406532fbb428ebeb9a98: Status 404 returned error can't find the container with id e522b41fabdd6d0cae613b3cd708dd3bffbd4d25dd82406532fbb428ebeb9a98 Apr 20 20:19:15.323296 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:15.323271 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" event={"ID":"a23f4d3b-cada-49ca-8a22-7404ff74a485","Type":"ContainerStarted","Data":"e522b41fabdd6d0cae613b3cd708dd3bffbd4d25dd82406532fbb428ebeb9a98"} Apr 20 20:19:37.590086 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:37.590020 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79b5459745-gr874" podUID="0bd5486b-d629-4252-8574-eae4ce3d45d7" containerName="console" 
containerID="cri-o://15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d" gracePeriod=15 Apr 20 20:19:39.218001 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.217980 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79b5459745-gr874_0bd5486b-d629-4252-8574-eae4ce3d45d7/console/0.log" Apr 20 20:19:39.218256 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.218039 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b5459745-gr874" Apr 20 20:19:39.384403 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384340 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384403 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384375 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384566 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384427 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384566 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384469 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert\") pod 
\"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384566 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384504 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384566 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384531 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384566 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384559 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknmf\" (UniqueName: \"kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf\") pod \"0bd5486b-d629-4252-8574-eae4ce3d45d7\" (UID: \"0bd5486b-d629-4252-8574-eae4ce3d45d7\") " Apr 20 20:19:39.384809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384702 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:39.384809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384781 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-service-ca\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.384809 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384799 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config" (OuterVolumeSpecName: "console-config") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:39.385012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384984 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:39.385012 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.384989 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:39.386702 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.386673 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:19:39.386702 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.386685 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:19:39.386824 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.386736 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf" (OuterVolumeSpecName: "kube-api-access-kknmf") pod "0bd5486b-d629-4252-8574-eae4ce3d45d7" (UID: "0bd5486b-d629-4252-8574-eae4ce3d45d7"). InnerVolumeSpecName "kube-api-access-kknmf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:39.408210 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408192 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79b5459745-gr874_0bd5486b-d629-4252-8574-eae4ce3d45d7/console/0.log" Apr 20 20:19:39.408309 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408231 2580 generic.go:358] "Generic (PLEG): container finished" podID="0bd5486b-d629-4252-8574-eae4ce3d45d7" containerID="15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d" exitCode=2 Apr 20 20:19:39.408309 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408294 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b5459745-gr874" Apr 20 20:19:39.408415 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b5459745-gr874" event={"ID":"0bd5486b-d629-4252-8574-eae4ce3d45d7","Type":"ContainerDied","Data":"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d"} Apr 20 20:19:39.408415 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b5459745-gr874" event={"ID":"0bd5486b-d629-4252-8574-eae4ce3d45d7","Type":"ContainerDied","Data":"25a467149d982b6561a087751a2bc5477b656efb7104d9cc99f00a68e6d4a552"} Apr 20 20:19:39.408415 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.408370 2580 scope.go:117] "RemoveContainer" containerID="15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d" Apr 20 20:19:39.409806 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.409784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" 
event={"ID":"a23f4d3b-cada-49ca-8a22-7404ff74a485","Type":"ContainerStarted","Data":"16b249e536c4c2eb69d2e36eae8aa0d794cc225df459e0cd5056ab35291124c8"} Apr 20 20:19:39.416470 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.416456 2580 scope.go:117] "RemoveContainer" containerID="15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d" Apr 20 20:19:39.416688 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:19:39.416670 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d\": container with ID starting with 15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d not found: ID does not exist" containerID="15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d" Apr 20 20:19:39.416737 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.416696 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d"} err="failed to get container status \"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d\": rpc error: code = NotFound desc = could not find container \"15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d\": container with ID starting with 15d8fcd00b2bf77604722e9fc1076ebd4dd833cca1dcbada847234b4627e399d not found: ID does not exist" Apr 20 20:19:39.431368 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.431319 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9mzrg" podStartSLOduration=1.514877 podStartE2EDuration="25.43130575s" podCreationTimestamp="2026-04-20 20:19:14 +0000 UTC" firstStartedPulling="2026-04-20 20:19:15.244904814 +0000 UTC m=+487.000275323" lastFinishedPulling="2026-04-20 20:19:39.161333565 +0000 UTC m=+510.916704073" observedRunningTime="2026-04-20 20:19:39.430419609 +0000 UTC 
m=+511.185790138" watchObservedRunningTime="2026-04-20 20:19:39.43130575 +0000 UTC m=+511.186676282" Apr 20 20:19:39.445878 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.445856 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79b5459745-gr874"] Apr 20 20:19:39.456821 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.453620 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79b5459745-gr874"] Apr 20 20:19:39.485710 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485682 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-serving-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.485830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485717 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-oauth-serving-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.485830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485735 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-oauth-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.485830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485749 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-trusted-ca-bundle\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.485830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485763 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kknmf\" (UniqueName: 
\"kubernetes.io/projected/0bd5486b-d629-4252-8574-eae4ce3d45d7-kube-api-access-kknmf\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:39.485830 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:39.485778 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd5486b-d629-4252-8574-eae4ce3d45d7-console-config\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:19:40.824454 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:19:40.824423 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd5486b-d629-4252-8574-eae4ce3d45d7" path="/var/lib/kubelet/pods/0bd5486b-d629-4252-8574-eae4ce3d45d7/volumes" Apr 20 20:20:10.935380 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:10.935348 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:10.935787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:10.935638 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bd5486b-d629-4252-8574-eae4ce3d45d7" containerName="console" Apr 20 20:20:10.935787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:10.935651 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd5486b-d629-4252-8574-eae4ce3d45d7" containerName="console" Apr 20 20:20:10.935787 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:10.935700 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bd5486b-d629-4252-8574-eae4ce3d45d7" containerName="console" Apr 20 20:20:11.019201 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.019174 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:11.019344 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.019281 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:11.022224 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.022199 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-c8b6k\"" Apr 20 20:20:11.112824 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.112798 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:11.125442 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.125417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9hl\" (UniqueName: \"kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl\") pod \"authorino-f99f4b5cd-lp7xl\" (UID: \"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3\") " pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:11.126145 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.126118 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:11.126261 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.126225 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:11.225964 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.225888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9hl\" (UniqueName: \"kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl\") pod \"authorino-f99f4b5cd-lp7xl\" (UID: \"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3\") " pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:11.225964 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.225938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2q9x\" (UniqueName: \"kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x\") pod \"authorino-7498df8756-vqhtx\" (UID: \"b2c3c272-a864-41ca-a796-cb625585ac17\") " pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:11.234412 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.234391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9hl\" (UniqueName: \"kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl\") pod \"authorino-f99f4b5cd-lp7xl\" (UID: \"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3\") " pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:11.327132 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.327103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2q9x\" (UniqueName: \"kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x\") pod \"authorino-7498df8756-vqhtx\" (UID: \"b2c3c272-a864-41ca-a796-cb625585ac17\") " pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:11.328487 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.328469 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:11.334883 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.334865 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2q9x\" (UniqueName: \"kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x\") pod \"authorino-7498df8756-vqhtx\" (UID: \"b2c3c272-a864-41ca-a796-cb625585ac17\") " pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:11.436800 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.436772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:11.450245 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.450220 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:11.452634 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:11.452594 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf44fc31b_13d4_4616_bf9b_73ac2d50e5b3.slice/crio-be4bbba6bf87064e2b9c4270c4772da340ef9fe5953c5f7b55e0959647691277 WatchSource:0}: Error finding container be4bbba6bf87064e2b9c4270c4772da340ef9fe5953c5f7b55e0959647691277: Status 404 returned error can't find the container with id be4bbba6bf87064e2b9c4270c4772da340ef9fe5953c5f7b55e0959647691277 Apr 20 20:20:11.510918 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.510887 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" event={"ID":"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3","Type":"ContainerStarted","Data":"be4bbba6bf87064e2b9c4270c4772da340ef9fe5953c5f7b55e0959647691277"} Apr 20 20:20:11.559062 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:11.559036 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:11.561750 
ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:11.561716 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2c3c272_a864_41ca_a796_cb625585ac17.slice/crio-5f74c87059912f734c3228fc59f577709e4f3f22806cd1bfc0c699897df41ca3 WatchSource:0}: Error finding container 5f74c87059912f734c3228fc59f577709e4f3f22806cd1bfc0c699897df41ca3: Status 404 returned error can't find the container with id 5f74c87059912f734c3228fc59f577709e4f3f22806cd1bfc0c699897df41ca3 Apr 20 20:20:12.517369 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:12.517327 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vqhtx" event={"ID":"b2c3c272-a864-41ca-a796-cb625585ac17","Type":"ContainerStarted","Data":"5f74c87059912f734c3228fc59f577709e4f3f22806cd1bfc0c699897df41ca3"} Apr 20 20:20:15.529351 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:15.529309 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vqhtx" event={"ID":"b2c3c272-a864-41ca-a796-cb625585ac17","Type":"ContainerStarted","Data":"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2"} Apr 20 20:20:15.530596 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:15.530572 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" event={"ID":"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3","Type":"ContainerStarted","Data":"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1"} Apr 20 20:20:15.546419 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:15.546379 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-vqhtx" podStartSLOduration=0.895409172 podStartE2EDuration="4.54636644s" podCreationTimestamp="2026-04-20 20:20:11 +0000 UTC" firstStartedPulling="2026-04-20 20:20:11.562931996 +0000 UTC m=+543.318302506" lastFinishedPulling="2026-04-20 
20:20:15.213889266 +0000 UTC m=+546.969259774" observedRunningTime="2026-04-20 20:20:15.545609076 +0000 UTC m=+547.300979606" watchObservedRunningTime="2026-04-20 20:20:15.54636644 +0000 UTC m=+547.301736967" Apr 20 20:20:15.561137 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:15.561088 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" podStartSLOduration=1.791012048 podStartE2EDuration="5.561071932s" podCreationTimestamp="2026-04-20 20:20:10 +0000 UTC" firstStartedPulling="2026-04-20 20:20:11.454078922 +0000 UTC m=+543.209449431" lastFinishedPulling="2026-04-20 20:20:15.224138807 +0000 UTC m=+546.979509315" observedRunningTime="2026-04-20 20:20:15.558859568 +0000 UTC m=+547.314230098" watchObservedRunningTime="2026-04-20 20:20:15.561071932 +0000 UTC m=+547.316442465" Apr 20 20:20:15.582312 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:15.582288 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:17.536373 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:17.536313 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" podUID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" containerName="authorino" containerID="cri-o://20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1" gracePeriod=30 Apr 20 20:20:17.769869 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:17.769849 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:17.881112 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:17.881045 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9hl\" (UniqueName: \"kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl\") pod \"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3\" (UID: \"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3\") " Apr 20 20:20:17.883257 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:17.883232 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl" (OuterVolumeSpecName: "kube-api-access-vq9hl") pod "f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" (UID: "f44fc31b-13d4-4616-bf9b-73ac2d50e5b3"). InnerVolumeSpecName "kube-api-access-vq9hl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:17.981716 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:17.981692 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vq9hl\" (UniqueName: \"kubernetes.io/projected/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3-kube-api-access-vq9hl\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:20:18.540497 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.540461 2580 generic.go:358] "Generic (PLEG): container finished" podID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" containerID="20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1" exitCode=0 Apr 20 20:20:18.540967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.540510 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" Apr 20 20:20:18.540967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.540551 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" event={"ID":"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3","Type":"ContainerDied","Data":"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1"} Apr 20 20:20:18.540967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.540584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lp7xl" event={"ID":"f44fc31b-13d4-4616-bf9b-73ac2d50e5b3","Type":"ContainerDied","Data":"be4bbba6bf87064e2b9c4270c4772da340ef9fe5953c5f7b55e0959647691277"} Apr 20 20:20:18.540967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.540598 2580 scope.go:117] "RemoveContainer" containerID="20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1" Apr 20 20:20:18.548811 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.548796 2580 scope.go:117] "RemoveContainer" containerID="20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1" Apr 20 20:20:18.549060 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:20:18.549043 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1\": container with ID starting with 20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1 not found: ID does not exist" containerID="20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1" Apr 20 20:20:18.549110 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.549068 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1"} err="failed to get container status \"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1\": rpc error: code = 
NotFound desc = could not find container \"20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1\": container with ID starting with 20dbcbef3958852999a0d3af7e0327d0bedce081039f57f13f69bd0e24358bf1 not found: ID does not exist" Apr 20 20:20:18.560007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.559987 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:18.563590 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.563571 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lp7xl"] Apr 20 20:20:18.825101 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:18.825046 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" path="/var/lib/kubelet/pods/f44fc31b-13d4-4616-bf9b-73ac2d50e5b3/volumes" Apr 20 20:20:43.398319 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.398285 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:43.398689 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.398599 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" containerName="authorino" Apr 20 20:20:43.398689 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.398611 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" containerName="authorino" Apr 20 20:20:43.398689 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.398675 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f44fc31b-13d4-4616-bf9b-73ac2d50e5b3" containerName="authorino" Apr 20 20:20:43.401040 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.401024 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:43.407232 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.407207 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:43.467705 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.467670 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n48r\" (UniqueName: \"kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r\") pod \"authorino-8b475cf9f-pjkck\" (UID: \"fa707626-7eba-4190-afe4-336f16f12d1d\") " pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:43.568922 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.568895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n48r\" (UniqueName: \"kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r\") pod \"authorino-8b475cf9f-pjkck\" (UID: \"fa707626-7eba-4190-afe4-336f16f12d1d\") " pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:43.577263 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.577243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n48r\" (UniqueName: \"kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r\") pod \"authorino-8b475cf9f-pjkck\" (UID: \"fa707626-7eba-4190-afe4-336f16f12d1d\") " pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:43.580158 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.580134 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:43.580327 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.580316 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:43.605396 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.605371 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-fkxpn"] Apr 20 20:20:43.607935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.607919 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.610597 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.610576 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 20:20:43.625087 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.625066 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-fkxpn"] Apr 20 20:20:43.653884 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.653635 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-fkxpn"] Apr 20 20:20:43.654153 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:20:43.653944 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-llb6j tls-cert], unattached volumes=[], failed to process volumes=[kube-api-access-llb6j tls-cert]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-fkxpn" podUID="6281c23a-900c-47a2-8807-a6a02acf7c7d" Apr 20 20:20:43.680284 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.680261 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:20:43.682403 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.682387 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.692645 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.692623 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:20:43.716555 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.716531 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:43.719466 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:43.719441 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa707626_7eba_4190_afe4_336f16f12d1d.slice/crio-8e32e1e8d7168c53b875f955c8e05b610dacc2c95792de255b8018bbf1d57423 WatchSource:0}: Error finding container 8e32e1e8d7168c53b875f955c8e05b610dacc2c95792de255b8018bbf1d57423: Status 404 returned error can't find the container with id 8e32e1e8d7168c53b875f955c8e05b610dacc2c95792de255b8018bbf1d57423 Apr 20 20:20:43.770049 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.770027 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llb6j\" (UniqueName: \"kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.770147 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.770055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.770147 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.770123 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqp5c\" (UniqueName: \"kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.770231 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.770162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.871108 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.871085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqp5c\" (UniqueName: \"kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.871195 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.871136 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.871195 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.871156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llb6j\" (UniqueName: \"kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 
20:20:43.871195 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.871173 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.873618 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.873591 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:43.873703 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.873645 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.879767 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.879745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llb6j\" (UniqueName: \"kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j\") pod \"authorino-56fdd757f5-fkxpn\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:43.879869 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.879850 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqp5c\" (UniqueName: \"kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c\") pod \"authorino-55b6fbfb95-b6x4f\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 
20 20:20:43.993496 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:43.993474 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:20:44.109013 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.108988 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:20:44.111204 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:44.111179 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737c6379_5171_439a_b6ea_1ef4b7a97ddf.slice/crio-c5ca3104518d309eee1defb59b6207923a5fd98bbcc54e20e21e843606652bc6 WatchSource:0}: Error finding container c5ca3104518d309eee1defb59b6207923a5fd98bbcc54e20e21e843606652bc6: Status 404 returned error can't find the container with id c5ca3104518d309eee1defb59b6207923a5fd98bbcc54e20e21e843606652bc6 Apr 20 20:20:44.631479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.631440 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pjkck" event={"ID":"fa707626-7eba-4190-afe4-336f16f12d1d","Type":"ContainerStarted","Data":"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f"} Apr 20 20:20:44.631479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.631467 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-pjkck" podUID="fa707626-7eba-4190-afe4-336f16f12d1d" containerName="authorino" containerID="cri-o://62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f" gracePeriod=30 Apr 20 20:20:44.631977 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.631487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pjkck" 
event={"ID":"fa707626-7eba-4190-afe4-336f16f12d1d","Type":"ContainerStarted","Data":"8e32e1e8d7168c53b875f955c8e05b610dacc2c95792de255b8018bbf1d57423"} Apr 20 20:20:44.633077 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.633052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" event={"ID":"737c6379-5171-439a-b6ea-1ef4b7a97ddf","Type":"ContainerStarted","Data":"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f"} Apr 20 20:20:44.633189 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.633091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" event={"ID":"737c6379-5171-439a-b6ea-1ef4b7a97ddf","Type":"ContainerStarted","Data":"c5ca3104518d309eee1defb59b6207923a5fd98bbcc54e20e21e843606652bc6"} Apr 20 20:20:44.633189 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.633065 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:44.639034 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.639018 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:44.645298 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.645261 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-pjkck" podStartSLOduration=1.311129789 podStartE2EDuration="1.645252s" podCreationTimestamp="2026-04-20 20:20:43 +0000 UTC" firstStartedPulling="2026-04-20 20:20:43.720767789 +0000 UTC m=+575.476138298" lastFinishedPulling="2026-04-20 20:20:44.054890002 +0000 UTC m=+575.810260509" observedRunningTime="2026-04-20 20:20:44.644688847 +0000 UTC m=+576.400059388" watchObservedRunningTime="2026-04-20 20:20:44.645252 +0000 UTC m=+576.400622528" Apr 20 20:20:44.659069 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.659036 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" podStartSLOduration=1.355248097 podStartE2EDuration="1.659025661s" podCreationTimestamp="2026-04-20 20:20:43 +0000 UTC" firstStartedPulling="2026-04-20 20:20:44.113060298 +0000 UTC m=+575.868430805" lastFinishedPulling="2026-04-20 20:20:44.416837849 +0000 UTC m=+576.172208369" observedRunningTime="2026-04-20 20:20:44.658339342 +0000 UTC m=+576.413709871" watchObservedRunningTime="2026-04-20 20:20:44.659025661 +0000 UTC m=+576.414396189" Apr 20 20:20:44.682403 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.682348 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:44.682569 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.682548 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-vqhtx" podUID="b2c3c272-a864-41ca-a796-cb625585ac17" containerName="authorino" containerID="cri-o://ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2" gracePeriod=30 Apr 20 20:20:44.778042 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:20:44.778017 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llb6j\" (UniqueName: \"kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j\") pod \"6281c23a-900c-47a2-8807-a6a02acf7c7d\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " Apr 20 20:20:44.778147 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.778076 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert\") pod \"6281c23a-900c-47a2-8807-a6a02acf7c7d\" (UID: \"6281c23a-900c-47a2-8807-a6a02acf7c7d\") " Apr 20 20:20:44.780091 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.780064 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j" (OuterVolumeSpecName: "kube-api-access-llb6j") pod "6281c23a-900c-47a2-8807-a6a02acf7c7d" (UID: "6281c23a-900c-47a2-8807-a6a02acf7c7d"). InnerVolumeSpecName "kube-api-access-llb6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:44.780091 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.780075 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "6281c23a-900c-47a2-8807-a6a02acf7c7d" (UID: "6281c23a-900c-47a2-8807-a6a02acf7c7d"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:20:44.879479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.879453 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llb6j\" (UniqueName: \"kubernetes.io/projected/6281c23a-900c-47a2-8807-a6a02acf7c7d-kube-api-access-llb6j\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:20:44.879479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.879481 2580 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6281c23a-900c-47a2-8807-a6a02acf7c7d-tls-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:20:44.918576 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.918556 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:44.925651 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:44.925632 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:45.080924 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.080888 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2q9x\" (UniqueName: \"kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x\") pod \"b2c3c272-a864-41ca-a796-cb625585ac17\" (UID: \"b2c3c272-a864-41ca-a796-cb625585ac17\") " Apr 20 20:20:45.081116 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.081009 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n48r\" (UniqueName: \"kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r\") pod \"fa707626-7eba-4190-afe4-336f16f12d1d\" (UID: \"fa707626-7eba-4190-afe4-336f16f12d1d\") " Apr 20 20:20:45.083227 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.083196 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r" (OuterVolumeSpecName: "kube-api-access-2n48r") pod "fa707626-7eba-4190-afe4-336f16f12d1d" (UID: "fa707626-7eba-4190-afe4-336f16f12d1d"). InnerVolumeSpecName "kube-api-access-2n48r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:45.083323 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.083237 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x" (OuterVolumeSpecName: "kube-api-access-n2q9x") pod "b2c3c272-a864-41ca-a796-cb625585ac17" (UID: "b2c3c272-a864-41ca-a796-cb625585ac17"). InnerVolumeSpecName "kube-api-access-n2q9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:45.181896 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.181871 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2q9x\" (UniqueName: \"kubernetes.io/projected/b2c3c272-a864-41ca-a796-cb625585ac17-kube-api-access-n2q9x\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:20:45.181896 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.181895 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2n48r\" (UniqueName: \"kubernetes.io/projected/fa707626-7eba-4190-afe4-336f16f12d1d-kube-api-access-2n48r\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:20:45.638758 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.638727 2580 generic.go:358] "Generic (PLEG): container finished" podID="fa707626-7eba-4190-afe4-336f16f12d1d" containerID="62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f" exitCode=0 Apr 20 20:20:45.639192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.638831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pjkck" event={"ID":"fa707626-7eba-4190-afe4-336f16f12d1d","Type":"ContainerDied","Data":"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f"} Apr 20 20:20:45.639192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.638841 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pjkck" Apr 20 20:20:45.639192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.638865 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pjkck" event={"ID":"fa707626-7eba-4190-afe4-336f16f12d1d","Type":"ContainerDied","Data":"8e32e1e8d7168c53b875f955c8e05b610dacc2c95792de255b8018bbf1d57423"} Apr 20 20:20:45.639192 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.638891 2580 scope.go:117] "RemoveContainer" containerID="62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f" Apr 20 20:20:45.640082 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.640061 2580 generic.go:358] "Generic (PLEG): container finished" podID="b2c3c272-a864-41ca-a796-cb625585ac17" containerID="ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2" exitCode=0 Apr 20 20:20:45.640148 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.640101 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vqhtx" Apr 20 20:20:45.640207 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.640141 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vqhtx" event={"ID":"b2c3c272-a864-41ca-a796-cb625585ac17","Type":"ContainerDied","Data":"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2"} Apr 20 20:20:45.640207 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.640171 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vqhtx" event={"ID":"b2c3c272-a864-41ca-a796-cb625585ac17","Type":"ContainerDied","Data":"5f74c87059912f734c3228fc59f577709e4f3f22806cd1bfc0c699897df41ca3"} Apr 20 20:20:45.640384 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.640368 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-fkxpn" Apr 20 20:20:45.648060 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.648017 2580 scope.go:117] "RemoveContainer" containerID="62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f" Apr 20 20:20:45.648292 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:20:45.648276 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f\": container with ID starting with 62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f not found: ID does not exist" containerID="62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f" Apr 20 20:20:45.648355 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.648298 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f"} err="failed to get container status \"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f\": rpc error: code = NotFound desc = could not find container \"62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f\": container with ID starting with 62ed5336320302d111e5754a7ea2fb65675800b2542f066df02dba3d52f48f2f not found: ID does not exist" Apr 20 20:20:45.648355 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.648315 2580 scope.go:117] "RemoveContainer" containerID="ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2" Apr 20 20:20:45.655358 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.655344 2580 scope.go:117] "RemoveContainer" containerID="ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2" Apr 20 20:20:45.655587 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:20:45.655572 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2\": container with ID starting with ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2 not found: ID does not exist" containerID="ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2" Apr 20 20:20:45.655625 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.655594 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2"} err="failed to get container status \"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2\": rpc error: code = NotFound desc = could not find container \"ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2\": container with ID starting with ad90f3a8079228e556352bcbc362f2de46db7f03b3574bf92edb59254a2105d2 not found: ID does not exist" Apr 20 20:20:45.668708 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.668684 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:20:45.669122 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.669107 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa707626-7eba-4190-afe4-336f16f12d1d" containerName="authorino" Apr 20 20:20:45.669171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.669128 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa707626-7eba-4190-afe4-336f16f12d1d" containerName="authorino" Apr 20 20:20:45.669171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.669156 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2c3c272-a864-41ca-a796-cb625585ac17" containerName="authorino" Apr 20 20:20:45.669171 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.669165 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c3c272-a864-41ca-a796-cb625585ac17" containerName="authorino" Apr 20 20:20:45.669258 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:20:45.669239 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2c3c272-a864-41ca-a796-cb625585ac17" containerName="authorino" Apr 20 20:20:45.669258 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.669252 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa707626-7eba-4190-afe4-336f16f12d1d" containerName="authorino" Apr 20 20:20:45.672070 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.672053 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-fkxpn"] Apr 20 20:20:45.672150 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.672139 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:45.674793 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.674772 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-ghmlq\"" Apr 20 20:20:45.681600 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.681582 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-fkxpn"] Apr 20 20:20:45.682850 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.682832 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:20:45.694852 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.694833 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:45.698479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.698462 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pjkck"] Apr 20 20:20:45.721558 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.721538 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:45.729501 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:20:45.729484 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-vqhtx"] Apr 20 20:20:45.787666 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.787647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hvf\" (UniqueName: \"kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf\") pod \"maas-controller-f688956b5-pdtw9\" (UID: \"c0a13edd-a14a-499a-8d9f-a82a77ecb017\") " pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:45.812405 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.810307 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:20:45.813792 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.813771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:45.821299 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.821087 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:20:45.889174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.889083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hvf\" (UniqueName: \"kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf\") pod \"maas-controller-f688956b5-pdtw9\" (UID: \"c0a13edd-a14a-499a-8d9f-a82a77ecb017\") " pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:45.889174 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.889165 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qtt\" (UniqueName: \"kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt\") pod \"maas-controller-7b95459d54-69q86\" (UID: \"7cef7edc-8e2f-471d-af26-0427fa0bc918\") " 
pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:45.897736 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.897711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hvf\" (UniqueName: \"kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf\") pod \"maas-controller-f688956b5-pdtw9\" (UID: \"c0a13edd-a14a-499a-8d9f-a82a77ecb017\") " pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:45.982456 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.982425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:45.990751 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:45.990724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52qtt\" (UniqueName: \"kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt\") pod \"maas-controller-7b95459d54-69q86\" (UID: \"7cef7edc-8e2f-471d-af26-0427fa0bc918\") " pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:46.003609 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.003562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qtt\" (UniqueName: \"kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt\") pod \"maas-controller-7b95459d54-69q86\" (UID: \"7cef7edc-8e2f-471d-af26-0427fa0bc918\") " pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:46.126338 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.126310 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:46.240369 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.240346 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:20:46.242263 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:46.242240 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cef7edc_8e2f_471d_af26_0427fa0bc918.slice/crio-5fb346c73301ee699595abf8b418b87ec09ced144167b15b1085bebc211b4eb4 WatchSource:0}: Error finding container 5fb346c73301ee699595abf8b418b87ec09ced144167b15b1085bebc211b4eb4: Status 404 returned error can't find the container with id 5fb346c73301ee699595abf8b418b87ec09ced144167b15b1085bebc211b4eb4 Apr 20 20:20:46.320642 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.320621 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:20:46.322300 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:20:46.322268 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a13edd_a14a_499a_8d9f_a82a77ecb017.slice/crio-d1670e606e8a9cf21812a7e06e2958c518c6a72efe35bb040399037cf7fb5234 WatchSource:0}: Error finding container d1670e606e8a9cf21812a7e06e2958c518c6a72efe35bb040399037cf7fb5234: Status 404 returned error can't find the container with id d1670e606e8a9cf21812a7e06e2958c518c6a72efe35bb040399037cf7fb5234 Apr 20 20:20:46.644378 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.644343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b95459d54-69q86" event={"ID":"7cef7edc-8e2f-471d-af26-0427fa0bc918","Type":"ContainerStarted","Data":"5fb346c73301ee699595abf8b418b87ec09ced144167b15b1085bebc211b4eb4"} Apr 20 20:20:46.645911 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.645887 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f688956b5-pdtw9" event={"ID":"c0a13edd-a14a-499a-8d9f-a82a77ecb017","Type":"ContainerStarted","Data":"d1670e606e8a9cf21812a7e06e2958c518c6a72efe35bb040399037cf7fb5234"} Apr 20 20:20:46.825330 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.825300 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6281c23a-900c-47a2-8807-a6a02acf7c7d" path="/var/lib/kubelet/pods/6281c23a-900c-47a2-8807-a6a02acf7c7d/volumes" Apr 20 20:20:46.825518 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.825506 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c3c272-a864-41ca-a796-cb625585ac17" path="/var/lib/kubelet/pods/b2c3c272-a864-41ca-a796-cb625585ac17/volumes" Apr 20 20:20:46.825806 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:46.825794 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa707626-7eba-4190-afe4-336f16f12d1d" path="/var/lib/kubelet/pods/fa707626-7eba-4190-afe4-336f16f12d1d/volumes" Apr 20 20:20:49.662611 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:49.662567 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f688956b5-pdtw9" event={"ID":"c0a13edd-a14a-499a-8d9f-a82a77ecb017","Type":"ContainerStarted","Data":"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7"} Apr 20 20:20:49.663468 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:49.663445 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:20:49.664746 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:49.664719 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b95459d54-69q86" event={"ID":"7cef7edc-8e2f-471d-af26-0427fa0bc918","Type":"ContainerStarted","Data":"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db"} Apr 20 20:20:49.664990 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:20:49.664971 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:20:49.678670 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:49.678631 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f688956b5-pdtw9" podStartSLOduration=1.706239649 podStartE2EDuration="4.678619791s" podCreationTimestamp="2026-04-20 20:20:45 +0000 UTC" firstStartedPulling="2026-04-20 20:20:46.323474414 +0000 UTC m=+578.078844925" lastFinishedPulling="2026-04-20 20:20:49.29585456 +0000 UTC m=+581.051225067" observedRunningTime="2026-04-20 20:20:49.676969569 +0000 UTC m=+581.432340091" watchObservedRunningTime="2026-04-20 20:20:49.678619791 +0000 UTC m=+581.433990317" Apr 20 20:20:49.690773 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:20:49.690726 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7b95459d54-69q86" podStartSLOduration=1.639027271 podStartE2EDuration="4.690714796s" podCreationTimestamp="2026-04-20 20:20:45 +0000 UTC" firstStartedPulling="2026-04-20 20:20:46.243579883 +0000 UTC m=+577.998950390" lastFinishedPulling="2026-04-20 20:20:49.295267404 +0000 UTC m=+581.050637915" observedRunningTime="2026-04-20 20:20:49.690180728 +0000 UTC m=+581.445551257" watchObservedRunningTime="2026-04-20 20:20:49.690714796 +0000 UTC m=+581.446085325" Apr 20 20:21:00.673935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:00.673906 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:21:00.714025 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:00.714000 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:21:00.714243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:00.714221 2580 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="opendatahub/maas-controller-f688956b5-pdtw9" podUID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" containerName="manager" containerID="cri-o://6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7" gracePeriod=10 Apr 20 20:21:00.721295 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:00.721276 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:21:00.945969 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:00.945937 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:21:01.014681 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.014653 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:21:01.014978 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.014966 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" containerName="manager" Apr 20 20:21:01.015032 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.014979 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" containerName="manager" Apr 20 20:21:01.015069 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.015034 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" containerName="manager" Apr 20 20:21:01.017855 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.017839 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:01.027418 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.027395 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:21:01.103479 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.103457 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hvf\" (UniqueName: \"kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf\") pod \"c0a13edd-a14a-499a-8d9f-a82a77ecb017\" (UID: \"c0a13edd-a14a-499a-8d9f-a82a77ecb017\") " Apr 20 20:21:01.105614 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.105589 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf" (OuterVolumeSpecName: "kube-api-access-r9hvf") pod "c0a13edd-a14a-499a-8d9f-a82a77ecb017" (UID: "c0a13edd-a14a-499a-8d9f-a82a77ecb017"). InnerVolumeSpecName "kube-api-access-r9hvf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:21:01.204705 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.204656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftbrv\" (UniqueName: \"kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv\") pod \"maas-controller-5bdbc5f8f4-b2lx7\" (UID: \"a74e5889-75c5-4147-af51-e74b339716f4\") " pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:01.204780 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.204708 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9hvf\" (UniqueName: \"kubernetes.io/projected/c0a13edd-a14a-499a-8d9f-a82a77ecb017-kube-api-access-r9hvf\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:21:01.306048 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.306023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftbrv\" (UniqueName: \"kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv\") pod \"maas-controller-5bdbc5f8f4-b2lx7\" (UID: \"a74e5889-75c5-4147-af51-e74b339716f4\") " pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:01.314248 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.314224 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftbrv\" (UniqueName: \"kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv\") pod \"maas-controller-5bdbc5f8f4-b2lx7\" (UID: \"a74e5889-75c5-4147-af51-e74b339716f4\") " pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:01.328510 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.328493 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:01.441837 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.441814 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:21:01.444176 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:21:01.444150 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74e5889_75c5_4147_af51_e74b339716f4.slice/crio-6e33051f66f6267397093421f5dcd457c74cb9b603be5faa80a15a16a50a5a6d WatchSource:0}: Error finding container 6e33051f66f6267397093421f5dcd457c74cb9b603be5faa80a15a16a50a5a6d: Status 404 returned error can't find the container with id 6e33051f66f6267397093421f5dcd457c74cb9b603be5faa80a15a16a50a5a6d Apr 20 20:21:01.702588 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.702554 2580 generic.go:358] "Generic (PLEG): container finished" podID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" containerID="6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7" exitCode=0 Apr 20 20:21:01.703014 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.702630 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f688956b5-pdtw9" event={"ID":"c0a13edd-a14a-499a-8d9f-a82a77ecb017","Type":"ContainerDied","Data":"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7"} Apr 20 20:21:01.703014 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.702657 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f688956b5-pdtw9" Apr 20 20:21:01.703014 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.702668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f688956b5-pdtw9" event={"ID":"c0a13edd-a14a-499a-8d9f-a82a77ecb017","Type":"ContainerDied","Data":"d1670e606e8a9cf21812a7e06e2958c518c6a72efe35bb040399037cf7fb5234"} Apr 20 20:21:01.703014 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.702685 2580 scope.go:117] "RemoveContainer" containerID="6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7" Apr 20 20:21:01.703817 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.703796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" event={"ID":"a74e5889-75c5-4147-af51-e74b339716f4","Type":"ContainerStarted","Data":"6e33051f66f6267397093421f5dcd457c74cb9b603be5faa80a15a16a50a5a6d"} Apr 20 20:21:01.710318 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.710303 2580 scope.go:117] "RemoveContainer" containerID="6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7" Apr 20 20:21:01.710574 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:21:01.710552 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7\": container with ID starting with 6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7 not found: ID does not exist" containerID="6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7" Apr 20 20:21:01.710644 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.710580 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7"} err="failed to get container status \"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7\": rpc error: 
code = NotFound desc = could not find container \"6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7\": container with ID starting with 6859f99787a31e5921c078f9fdb1d363752848eaefdad77515173e1fed7c47e7 not found: ID does not exist" Apr 20 20:21:01.724536 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.724516 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:21:01.728656 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:01.728637 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f688956b5-pdtw9"] Apr 20 20:21:02.708630 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:02.708599 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" event={"ID":"a74e5889-75c5-4147-af51-e74b339716f4","Type":"ContainerStarted","Data":"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4"} Apr 20 20:21:02.709003 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:02.708720 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:02.730251 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:02.730199 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" podStartSLOduration=1.30719676 podStartE2EDuration="1.73018349s" podCreationTimestamp="2026-04-20 20:21:01 +0000 UTC" firstStartedPulling="2026-04-20 20:21:01.445414687 +0000 UTC m=+593.200785195" lastFinishedPulling="2026-04-20 20:21:01.868401418 +0000 UTC m=+593.623771925" observedRunningTime="2026-04-20 20:21:02.730042497 +0000 UTC m=+594.485413026" watchObservedRunningTime="2026-04-20 20:21:02.73018349 +0000 UTC m=+594.485554020" Apr 20 20:21:02.824467 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:02.824441 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c0a13edd-a14a-499a-8d9f-a82a77ecb017" path="/var/lib/kubelet/pods/c0a13edd-a14a-499a-8d9f-a82a77ecb017/volumes" Apr 20 20:21:06.652622 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.652583 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:06.656317 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.656297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.658983 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.658945 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 20:21:06.659090 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.658945 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 20:21:06.659090 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.659001 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qp9nz\"" Apr 20 20:21:06.666363 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.666346 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:06.850567 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.850533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.850722 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.850654 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn5nv\" (UniqueName: 
\"kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.951694 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.951601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn5nv\" (UniqueName: \"kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.951694 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.951653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.954124 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.954099 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.959290 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.959267 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn5nv\" (UniqueName: \"kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv\") pod \"maas-api-6c46cd47bd-65hwv\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:06.967131 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:06.967104 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:07.088516 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:07.088493 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:07.090298 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:21:07.090269 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dcd194_de3e_4feb_b635_fbac03ccaded.slice/crio-430145b3f0e62ca540109b9fe0803a4189d0f07ae114798dcd0fe4ae2e4847c3 WatchSource:0}: Error finding container 430145b3f0e62ca540109b9fe0803a4189d0f07ae114798dcd0fe4ae2e4847c3: Status 404 returned error can't find the container with id 430145b3f0e62ca540109b9fe0803a4189d0f07ae114798dcd0fe4ae2e4847c3 Apr 20 20:21:07.730472 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:07.730435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c46cd47bd-65hwv" event={"ID":"d0dcd194-de3e-4feb-b635-fbac03ccaded","Type":"ContainerStarted","Data":"430145b3f0e62ca540109b9fe0803a4189d0f07ae114798dcd0fe4ae2e4847c3"} Apr 20 20:21:08.732847 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:08.732790 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:21:08.733930 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:08.733910 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:21:08.734073 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:08.733939 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c46cd47bd-65hwv" event={"ID":"d0dcd194-de3e-4feb-b635-fbac03ccaded","Type":"ContainerStarted","Data":"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad"} Apr 20 20:21:08.734197 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:08.734179 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:08.750162 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:08.750118 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6c46cd47bd-65hwv" podStartSLOduration=1.207337178 podStartE2EDuration="2.750102292s" podCreationTimestamp="2026-04-20 20:21:06 +0000 UTC" firstStartedPulling="2026-04-20 20:21:07.091672833 +0000 UTC m=+598.847043341" lastFinishedPulling="2026-04-20 20:21:08.634437948 +0000 UTC m=+600.389808455" observedRunningTime="2026-04-20 20:21:08.748899362 +0000 UTC m=+600.504269891" watchObservedRunningTime="2026-04-20 20:21:08.750102292 +0000 UTC m=+600.505472822" Apr 20 20:21:13.721053 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:13.721022 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:21:13.763018 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:13.762991 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:21:13.763214 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:13.763180 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7b95459d54-69q86" podUID="7cef7edc-8e2f-471d-af26-0427fa0bc918" containerName="manager" containerID="cri-o://0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db" gracePeriod=10 Apr 20 20:21:14.003500 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.003475 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:21:14.102091 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.102067 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qtt\" (UniqueName: \"kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt\") pod \"7cef7edc-8e2f-471d-af26-0427fa0bc918\" (UID: \"7cef7edc-8e2f-471d-af26-0427fa0bc918\") " Apr 20 20:21:14.104139 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.104116 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt" (OuterVolumeSpecName: "kube-api-access-52qtt") pod "7cef7edc-8e2f-471d-af26-0427fa0bc918" (UID: "7cef7edc-8e2f-471d-af26-0427fa0bc918"). InnerVolumeSpecName "kube-api-access-52qtt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:21:14.202988 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.202946 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52qtt\" (UniqueName: \"kubernetes.io/projected/7cef7edc-8e2f-471d-af26-0427fa0bc918-kube-api-access-52qtt\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:21:14.742425 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.742404 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:14.754055 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.754030 2580 generic.go:358] "Generic (PLEG): container finished" podID="7cef7edc-8e2f-471d-af26-0427fa0bc918" containerID="0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db" exitCode=0 Apr 20 20:21:14.754161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.754081 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b95459d54-69q86" Apr 20 20:21:14.754161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.754109 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b95459d54-69q86" event={"ID":"7cef7edc-8e2f-471d-af26-0427fa0bc918","Type":"ContainerDied","Data":"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db"} Apr 20 20:21:14.754161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.754144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b95459d54-69q86" event={"ID":"7cef7edc-8e2f-471d-af26-0427fa0bc918","Type":"ContainerDied","Data":"5fb346c73301ee699595abf8b418b87ec09ced144167b15b1085bebc211b4eb4"} Apr 20 20:21:14.754161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.754160 2580 scope.go:117] "RemoveContainer" containerID="0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db" Apr 20 20:21:14.769124 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.768882 2580 scope.go:117] "RemoveContainer" containerID="0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db" Apr 20 20:21:14.769198 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:21:14.769170 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db\": container with ID starting with 0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db not found: ID does not exist" containerID="0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db" Apr 20 20:21:14.769257 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.769198 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db"} err="failed to get container status \"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db\": rpc error: 
code = NotFound desc = could not find container \"0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db\": container with ID starting with 0dee1048726506c7168ae025371c507ad2d39b949ff5805e7e72899c39d664db not found: ID does not exist" Apr 20 20:21:14.781082 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.781061 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:21:14.784439 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.784418 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7b95459d54-69q86"] Apr 20 20:21:14.824742 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:14.824717 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cef7edc-8e2f-471d-af26-0427fa0bc918" path="/var/lib/kubelet/pods/7cef7edc-8e2f-471d-af26-0427fa0bc918/volumes" Apr 20 20:21:32.673308 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.673275 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7bc8759798-ccpqc"] Apr 20 20:21:32.673785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.673560 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cef7edc-8e2f-471d-af26-0427fa0bc918" containerName="manager" Apr 20 20:21:32.673785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.673571 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cef7edc-8e2f-471d-af26-0427fa0bc918" containerName="manager" Apr 20 20:21:32.673785 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.673634 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cef7edc-8e2f-471d-af26-0427fa0bc918" containerName="manager" Apr 20 20:21:32.679968 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.679927 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.685967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.685928 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7bc8759798-ccpqc"] Apr 20 20:21:32.735999 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.735942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/25dd608d-02cb-486a-b792-a7a037bd8bb6-maas-api-tls\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.735999 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.735994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnhj\" (UniqueName: \"kubernetes.io/projected/25dd608d-02cb-486a-b792-a7a037bd8bb6-kube-api-access-8nnhj\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.836710 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.836677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/25dd608d-02cb-486a-b792-a7a037bd8bb6-maas-api-tls\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.836710 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.836710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnhj\" (UniqueName: \"kubernetes.io/projected/25dd608d-02cb-486a-b792-a7a037bd8bb6-kube-api-access-8nnhj\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.839407 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:21:32.839379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/25dd608d-02cb-486a-b792-a7a037bd8bb6-maas-api-tls\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.844346 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.844322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnhj\" (UniqueName: \"kubernetes.io/projected/25dd608d-02cb-486a-b792-a7a037bd8bb6-kube-api-access-8nnhj\") pod \"maas-api-7bc8759798-ccpqc\" (UID: \"25dd608d-02cb-486a-b792-a7a037bd8bb6\") " pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:32.990827 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:32.990790 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:33.109875 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:33.109849 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7bc8759798-ccpqc"] Apr 20 20:21:33.112304 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:21:33.112273 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dd608d_02cb_486a_b792_a7a037bd8bb6.slice/crio-a60e33cba8e0b8ad11c0f1d3c241ea76d8b6701292d4dfe97cbf26a055c5c422 WatchSource:0}: Error finding container a60e33cba8e0b8ad11c0f1d3c241ea76d8b6701292d4dfe97cbf26a055c5c422: Status 404 returned error can't find the container with id a60e33cba8e0b8ad11c0f1d3c241ea76d8b6701292d4dfe97cbf26a055c5c422 Apr 20 20:21:33.826093 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:33.826054 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7bc8759798-ccpqc" 
event={"ID":"25dd608d-02cb-486a-b792-a7a037bd8bb6","Type":"ContainerStarted","Data":"a60e33cba8e0b8ad11c0f1d3c241ea76d8b6701292d4dfe97cbf26a055c5c422"} Apr 20 20:21:35.832881 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:35.832842 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7bc8759798-ccpqc" event={"ID":"25dd608d-02cb-486a-b792-a7a037bd8bb6","Type":"ContainerStarted","Data":"bf389ad22c4ed085a53764f329f00563024fe3de3871cce557c27f74749cefd6"} Apr 20 20:21:35.833357 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:35.832963 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:35.850807 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:35.850764 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7bc8759798-ccpqc" podStartSLOduration=2.035400288 podStartE2EDuration="3.850752597s" podCreationTimestamp="2026-04-20 20:21:32 +0000 UTC" firstStartedPulling="2026-04-20 20:21:33.113593049 +0000 UTC m=+624.868963557" lastFinishedPulling="2026-04-20 20:21:34.928945355 +0000 UTC m=+626.684315866" observedRunningTime="2026-04-20 20:21:35.848606553 +0000 UTC m=+627.603977084" watchObservedRunningTime="2026-04-20 20:21:35.850752597 +0000 UTC m=+627.606123127" Apr 20 20:21:41.841159 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:41.841129 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7bc8759798-ccpqc" Apr 20 20:21:41.881778 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:41.881749 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:41.882049 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:41.882004 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6c46cd47bd-65hwv" podUID="d0dcd194-de3e-4feb-b635-fbac03ccaded" containerName="maas-api" 
containerID="cri-o://4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad" gracePeriod=30 Apr 20 20:21:42.125532 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.120496 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:42.205523 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.205491 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls\") pod \"d0dcd194-de3e-4feb-b635-fbac03ccaded\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " Apr 20 20:21:42.205712 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.205614 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn5nv\" (UniqueName: \"kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv\") pod \"d0dcd194-de3e-4feb-b635-fbac03ccaded\" (UID: \"d0dcd194-de3e-4feb-b635-fbac03ccaded\") " Apr 20 20:21:42.207781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.207757 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "d0dcd194-de3e-4feb-b635-fbac03ccaded" (UID: "d0dcd194-de3e-4feb-b635-fbac03ccaded"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:21:42.207873 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.207815 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv" (OuterVolumeSpecName: "kube-api-access-rn5nv") pod "d0dcd194-de3e-4feb-b635-fbac03ccaded" (UID: "d0dcd194-de3e-4feb-b635-fbac03ccaded"). InnerVolumeSpecName "kube-api-access-rn5nv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:21:42.306378 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.306356 2580 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0dcd194-de3e-4feb-b635-fbac03ccaded-maas-api-tls\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:21:42.306378 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.306378 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rn5nv\" (UniqueName: \"kubernetes.io/projected/d0dcd194-de3e-4feb-b635-fbac03ccaded-kube-api-access-rn5nv\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:21:42.855265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.855231 2580 generic.go:358] "Generic (PLEG): container finished" podID="d0dcd194-de3e-4feb-b635-fbac03ccaded" containerID="4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad" exitCode=0 Apr 20 20:21:42.855648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.855269 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c46cd47bd-65hwv" event={"ID":"d0dcd194-de3e-4feb-b635-fbac03ccaded","Type":"ContainerDied","Data":"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad"} Apr 20 20:21:42.855648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.855293 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c46cd47bd-65hwv" event={"ID":"d0dcd194-de3e-4feb-b635-fbac03ccaded","Type":"ContainerDied","Data":"430145b3f0e62ca540109b9fe0803a4189d0f07ae114798dcd0fe4ae2e4847c3"} Apr 20 20:21:42.855648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.855298 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6c46cd47bd-65hwv" Apr 20 20:21:42.855648 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.855312 2580 scope.go:117] "RemoveContainer" containerID="4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad" Apr 20 20:21:42.863069 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.863053 2580 scope.go:117] "RemoveContainer" containerID="4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad" Apr 20 20:21:42.863302 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:21:42.863286 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad\": container with ID starting with 4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad not found: ID does not exist" containerID="4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad" Apr 20 20:21:42.863388 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.863309 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad"} err="failed to get container status \"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad\": rpc error: code = NotFound desc = could not find container \"4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad\": container with ID starting with 4e6da4b695caf9cd5f02c2347fcd708601e48c01fa7fe5d5d7a96739480801ad not found: ID does not exist" Apr 20 20:21:42.871226 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.871205 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:42.877935 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:42.877916 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6c46cd47bd-65hwv"] Apr 20 20:21:44.824646 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:21:44.824616 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0dcd194-de3e-4feb-b635-fbac03ccaded" path="/var/lib/kubelet/pods/d0dcd194-de3e-4feb-b635-fbac03ccaded/volumes" Apr 20 20:21:49.747322 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.747280 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h"] Apr 20 20:21:49.747920 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.747753 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0dcd194-de3e-4feb-b635-fbac03ccaded" containerName="maas-api" Apr 20 20:21:49.747920 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.747773 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dcd194-de3e-4feb-b635-fbac03ccaded" containerName="maas-api" Apr 20 20:21:49.747920 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.747863 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0dcd194-de3e-4feb-b635-fbac03ccaded" containerName="maas-api" Apr 20 20:21:49.751201 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.751180 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.753819 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.753793 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-6d4f8\"" Apr 20 20:21:49.753938 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.753881 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 20:21:49.755199 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.755184 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 20:21:49.755243 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.755234 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 20:21:49.758911 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.758888 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h"] Apr 20 20:21:49.872659 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.872840 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872667 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.872840 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w47f\" (UniqueName: \"kubernetes.io/projected/54d39fd1-bb6c-4546-ae98-5861f323e31e-kube-api-access-7w47f\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.872840 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.872840 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872812 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.873001 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.872862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54d39fd1-bb6c-4546-ae98-5861f323e31e-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: 
\"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.973580 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973546 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.973781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973610 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54d39fd1-bb6c-4546-ae98-5861f323e31e-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.973781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.973781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 
20:21:49.973781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w47f\" (UniqueName: \"kubernetes.io/projected/54d39fd1-bb6c-4546-ae98-5861f323e31e-kube-api-access-7w47f\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.973781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.973712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.974089 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.974063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.974161 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.974124 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.974208 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.974148 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.976172 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.976154 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54d39fd1-bb6c-4546-ae98-5861f323e31e-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.976337 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.976321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54d39fd1-bb6c-4546-ae98-5861f323e31e-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:49.982412 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:49.982387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w47f\" (UniqueName: \"kubernetes.io/projected/54d39fd1-bb6c-4546-ae98-5861f323e31e-kube-api-access-7w47f\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h\" (UID: \"54d39fd1-bb6c-4546-ae98-5861f323e31e\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:50.063196 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:50.063100 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:21:50.185989 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:50.185945 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h"] Apr 20 20:21:50.187630 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:21:50.187600 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d39fd1_bb6c_4546_ae98_5861f323e31e.slice/crio-7b7fe7fe6b2970773a832e99ca3b01ed904105d1488810dc811116b8808cbe1a WatchSource:0}: Error finding container 7b7fe7fe6b2970773a832e99ca3b01ed904105d1488810dc811116b8808cbe1a: Status 404 returned error can't find the container with id 7b7fe7fe6b2970773a832e99ca3b01ed904105d1488810dc811116b8808cbe1a Apr 20 20:21:50.891000 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:50.890964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" event={"ID":"54d39fd1-bb6c-4546-ae98-5861f323e31e","Type":"ContainerStarted","Data":"7b7fe7fe6b2970773a832e99ca3b01ed904105d1488810dc811116b8808cbe1a"} Apr 20 20:21:55.910694 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:21:55.910656 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" event={"ID":"54d39fd1-bb6c-4546-ae98-5861f323e31e","Type":"ContainerStarted","Data":"ce5931b65b5bf9218f2f3ccf9f547af3c640215ce558096ab36f85df0dd92a45"} Apr 20 20:22:01.930715 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:01.930675 2580 generic.go:358] "Generic (PLEG): container finished" podID="54d39fd1-bb6c-4546-ae98-5861f323e31e" containerID="ce5931b65b5bf9218f2f3ccf9f547af3c640215ce558096ab36f85df0dd92a45" exitCode=0 Apr 20 20:22:01.931143 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:01.930730 2580 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" event={"ID":"54d39fd1-bb6c-4546-ae98-5861f323e31e","Type":"ContainerDied","Data":"ce5931b65b5bf9218f2f3ccf9f547af3c640215ce558096ab36f85df0dd92a45"} Apr 20 20:22:03.939549 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:03.939515 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" event={"ID":"54d39fd1-bb6c-4546-ae98-5861f323e31e","Type":"ContainerStarted","Data":"a10f001968536523fb955b9bddb23043132f0fe13ba90d2b591e6a976ed7cc72"} Apr 20 20:22:03.939914 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:03.939723 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:22:03.958394 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:03.958353 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" podStartSLOduration=2.131587757 podStartE2EDuration="14.9583392s" podCreationTimestamp="2026-04-20 20:21:49 +0000 UTC" firstStartedPulling="2026-04-20 20:21:50.18931632 +0000 UTC m=+641.944686828" lastFinishedPulling="2026-04-20 20:22:03.016067745 +0000 UTC m=+654.771438271" observedRunningTime="2026-04-20 20:22:03.956254337 +0000 UTC m=+655.711624869" watchObservedRunningTime="2026-04-20 20:22:03.9583392 +0000 UTC m=+655.713709729" Apr 20 20:22:14.956680 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:14.956647 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h" Apr 20 20:22:58.351372 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.351343 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm"] Apr 20 20:22:58.353782 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:22:58.353766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.356302 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.356283 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 20:22:58.363050 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.363028 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm"] Apr 20 20:22:58.414022 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.413987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.414170 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.414033 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.414170 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.414133 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.414293 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.414224 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.414293 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.414259 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thk6w\" (UniqueName: \"kubernetes.io/projected/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kube-api-access-thk6w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.414362 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.414317 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515080 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515080 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515086 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515287 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thk6w\" (UniqueName: \"kubernetes.io/projected/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kube-api-access-thk6w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515560 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515698 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.515698 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.515625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.517585 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.517562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.517815 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.517799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.522471 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.522454 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thk6w\" (UniqueName: \"kubernetes.io/projected/2d9ba340-b926-43b0-a689-3f3c3f7a1abc-kube-api-access-thk6w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm\" (UID: \"2d9ba340-b926-43b0-a689-3f3c3f7a1abc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.665267 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.665201 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:22:58.785462 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.785438 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm"] Apr 20 20:22:58.787495 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:22:58.787465 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9ba340_b926_43b0_a689_3f3c3f7a1abc.slice/crio-20f026bebb0dc4297532af462821141f085e939269e5a66dbc70cfb48af83fd2 WatchSource:0}: Error finding container 20f026bebb0dc4297532af462821141f085e939269e5a66dbc70cfb48af83fd2: Status 404 returned error can't find the container with id 20f026bebb0dc4297532af462821141f085e939269e5a66dbc70cfb48af83fd2 Apr 20 20:22:58.789207 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:58.789187 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:22:59.123007 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:59.122936 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" event={"ID":"2d9ba340-b926-43b0-a689-3f3c3f7a1abc","Type":"ContainerStarted","Data":"f0b3af118db1500e219cbeff3a68cae9c6ef7be3f665600d0ed171c4a177d125"} Apr 20 20:22:59.123149 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:22:59.123014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" event={"ID":"2d9ba340-b926-43b0-a689-3f3c3f7a1abc","Type":"ContainerStarted","Data":"20f026bebb0dc4297532af462821141f085e939269e5a66dbc70cfb48af83fd2"} Apr 20 20:23:04.140081 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:04.140050 2580 generic.go:358] "Generic (PLEG): container finished" podID="2d9ba340-b926-43b0-a689-3f3c3f7a1abc" containerID="f0b3af118db1500e219cbeff3a68cae9c6ef7be3f665600d0ed171c4a177d125" exitCode=0 Apr 20 20:23:04.140368 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:04.140123 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" event={"ID":"2d9ba340-b926-43b0-a689-3f3c3f7a1abc","Type":"ContainerDied","Data":"f0b3af118db1500e219cbeff3a68cae9c6ef7be3f665600d0ed171c4a177d125"} Apr 20 20:23:05.145076 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:05.145039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" event={"ID":"2d9ba340-b926-43b0-a689-3f3c3f7a1abc","Type":"ContainerStarted","Data":"9e38b4af9c8914543b89ec03f545218497fac5813bb079d8a7908017c818f726"} Apr 20 20:23:05.145506 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:05.145264 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:23:05.163387 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:05.163343 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" 
podStartSLOduration=6.974479669 podStartE2EDuration="7.16332928s" podCreationTimestamp="2026-04-20 20:22:58 +0000 UTC" firstStartedPulling="2026-04-20 20:23:04.140712712 +0000 UTC m=+715.896083220" lastFinishedPulling="2026-04-20 20:23:04.329562322 +0000 UTC m=+716.084932831" observedRunningTime="2026-04-20 20:23:05.16179892 +0000 UTC m=+716.917169462" watchObservedRunningTime="2026-04-20 20:23:05.16332928 +0000 UTC m=+716.918699809" Apr 20 20:23:16.161029 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:16.161000 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm" Apr 20 20:23:26.138197 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.138161 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5d4fff88bb-625wj"] Apr 20 20:23:26.140352 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.140333 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.148577 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.148557 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d4fff88bb-625wj"] Apr 20 20:23:26.239390 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.239364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-tls-cert\") pod \"authorino-5d4fff88bb-625wj\" (UID: \"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.239515 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.239412 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvhx\" (UniqueName: \"kubernetes.io/projected/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-kube-api-access-9gvhx\") pod \"authorino-5d4fff88bb-625wj\" (UID: 
\"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.340313 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.340286 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-tls-cert\") pod \"authorino-5d4fff88bb-625wj\" (UID: \"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.340407 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.340329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvhx\" (UniqueName: \"kubernetes.io/projected/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-kube-api-access-9gvhx\") pod \"authorino-5d4fff88bb-625wj\" (UID: \"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.343033 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.343004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-tls-cert\") pod \"authorino-5d4fff88bb-625wj\" (UID: \"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.347524 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.347507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvhx\" (UniqueName: \"kubernetes.io/projected/ebeb173a-fa8b-44bc-8afb-e24da1b6a12f-kube-api-access-9gvhx\") pod \"authorino-5d4fff88bb-625wj\" (UID: \"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f\") " pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.450542 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.450489 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5d4fff88bb-625wj" Apr 20 20:23:26.570859 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:26.570835 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d4fff88bb-625wj"] Apr 20 20:23:26.572877 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:23:26.572840 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebeb173a_fa8b_44bc_8afb_e24da1b6a12f.slice/crio-9a0d794d72adfc13267f8469a0c464b86a4aa794bba3d0aa4784c79726cadc69 WatchSource:0}: Error finding container 9a0d794d72adfc13267f8469a0c464b86a4aa794bba3d0aa4784c79726cadc69: Status 404 returned error can't find the container with id 9a0d794d72adfc13267f8469a0c464b86a4aa794bba3d0aa4784c79726cadc69 Apr 20 20:23:27.222786 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.222740 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d4fff88bb-625wj" event={"ID":"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f","Type":"ContainerStarted","Data":"2ea67b7d804bab412c156f7e310fcd9b8c1254fcfd214b9ac894fcccc3cdbc34"} Apr 20 20:23:27.222786 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.222799 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d4fff88bb-625wj" event={"ID":"ebeb173a-fa8b-44bc-8afb-e24da1b6a12f","Type":"ContainerStarted","Data":"9a0d794d72adfc13267f8469a0c464b86a4aa794bba3d0aa4784c79726cadc69"} Apr 20 20:23:27.242393 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.242333 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5d4fff88bb-625wj" podStartSLOduration=0.83174575 podStartE2EDuration="1.242312408s" podCreationTimestamp="2026-04-20 20:23:26 +0000 UTC" firstStartedPulling="2026-04-20 20:23:26.57432691 +0000 UTC m=+738.329697418" lastFinishedPulling="2026-04-20 20:23:26.984893558 +0000 UTC m=+738.740264076" 
observedRunningTime="2026-04-20 20:23:27.241355091 +0000 UTC m=+738.996725639" watchObservedRunningTime="2026-04-20 20:23:27.242312408 +0000 UTC m=+738.997682939" Apr 20 20:23:27.271594 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.271551 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:23:27.271883 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.271846 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" podUID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" containerName="authorino" containerID="cri-o://d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f" gracePeriod=30 Apr 20 20:23:27.516016 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.515987 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:23:27.654150 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.654120 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert\") pod \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " Apr 20 20:23:27.654306 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.654160 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqp5c\" (UniqueName: \"kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c\") pod \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\" (UID: \"737c6379-5171-439a-b6ea-1ef4b7a97ddf\") " Apr 20 20:23:27.656400 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.656362 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c" (OuterVolumeSpecName: "kube-api-access-kqp5c") pod 
"737c6379-5171-439a-b6ea-1ef4b7a97ddf" (UID: "737c6379-5171-439a-b6ea-1ef4b7a97ddf"). InnerVolumeSpecName "kube-api-access-kqp5c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:23:27.663910 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.663886 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "737c6379-5171-439a-b6ea-1ef4b7a97ddf" (UID: "737c6379-5171-439a-b6ea-1ef4b7a97ddf"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:23:27.755166 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.755103 2580 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/737c6379-5171-439a-b6ea-1ef4b7a97ddf-tls-cert\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:23:27.755166 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:27.755130 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqp5c\" (UniqueName: \"kubernetes.io/projected/737c6379-5171-439a-b6ea-1ef4b7a97ddf-kube-api-access-kqp5c\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:23:28.227397 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.227323 2580 generic.go:358] "Generic (PLEG): container finished" podID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" containerID="d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f" exitCode=0 Apr 20 20:23:28.227397 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.227383 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" Apr 20 20:23:28.227816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.227414 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" event={"ID":"737c6379-5171-439a-b6ea-1ef4b7a97ddf","Type":"ContainerDied","Data":"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f"} Apr 20 20:23:28.227816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.227447 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b6fbfb95-b6x4f" event={"ID":"737c6379-5171-439a-b6ea-1ef4b7a97ddf","Type":"ContainerDied","Data":"c5ca3104518d309eee1defb59b6207923a5fd98bbcc54e20e21e843606652bc6"} Apr 20 20:23:28.227816 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.227465 2580 scope.go:117] "RemoveContainer" containerID="d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f" Apr 20 20:23:28.236051 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.236036 2580 scope.go:117] "RemoveContainer" containerID="d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f" Apr 20 20:23:28.236297 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:23:28.236281 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f\": container with ID starting with d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f not found: ID does not exist" containerID="d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f" Apr 20 20:23:28.236343 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.236303 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f"} err="failed to get container status \"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f\": rpc error: code = 
NotFound desc = could not find container \"d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f\": container with ID starting with d3c173d320af315e64ecc538485e22156615f84ebe6cc8defb1a45b1364e207f not found: ID does not exist" Apr 20 20:23:28.247495 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.247475 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:23:28.250772 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.250749 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-55b6fbfb95-b6x4f"] Apr 20 20:23:28.824443 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:23:28.824402 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" path="/var/lib/kubelet/pods/737c6379-5171-439a-b6ea-1ef4b7a97ddf/volumes" Apr 20 20:24:49.031629 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.031594 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:24:49.032136 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.031816 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" podUID="a74e5889-75c5-4147-af51-e74b339716f4" containerName="manager" containerID="cri-o://8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4" gracePeriod=10 Apr 20 20:24:49.264075 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.264054 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:24:49.275889 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.275872 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftbrv\" (UniqueName: \"kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv\") pod \"a74e5889-75c5-4147-af51-e74b339716f4\" (UID: \"a74e5889-75c5-4147-af51-e74b339716f4\") " Apr 20 20:24:49.277864 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.277836 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv" (OuterVolumeSpecName: "kube-api-access-ftbrv") pod "a74e5889-75c5-4147-af51-e74b339716f4" (UID: "a74e5889-75c5-4147-af51-e74b339716f4"). InnerVolumeSpecName "kube-api-access-ftbrv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:24:49.377341 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.377320 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftbrv\" (UniqueName: \"kubernetes.io/projected/a74e5889-75c5-4147-af51-e74b339716f4-kube-api-access-ftbrv\") on node \"ip-10-0-141-183.ec2.internal\" DevicePath \"\"" Apr 20 20:24:49.493068 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.493044 2580 generic.go:358] "Generic (PLEG): container finished" podID="a74e5889-75c5-4147-af51-e74b339716f4" containerID="8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4" exitCode=0 Apr 20 20:24:49.493187 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.493099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" event={"ID":"a74e5889-75c5-4147-af51-e74b339716f4","Type":"ContainerDied","Data":"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4"} Apr 20 20:24:49.493187 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.493113 2580 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" Apr 20 20:24:49.493187 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.493120 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-b2lx7" event={"ID":"a74e5889-75c5-4147-af51-e74b339716f4","Type":"ContainerDied","Data":"6e33051f66f6267397093421f5dcd457c74cb9b603be5faa80a15a16a50a5a6d"} Apr 20 20:24:49.493187 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.493148 2580 scope.go:117] "RemoveContainer" containerID="8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4" Apr 20 20:24:49.501068 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.501048 2580 scope.go:117] "RemoveContainer" containerID="8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4" Apr 20 20:24:49.501319 ip-10-0-141-183 kubenswrapper[2580]: E0420 20:24:49.501301 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4\": container with ID starting with 8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4 not found: ID does not exist" containerID="8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4" Apr 20 20:24:49.501376 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.501328 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4"} err="failed to get container status \"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4\": rpc error: code = NotFound desc = could not find container \"8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4\": container with ID starting with 8e450cb3c67a03c223ff28bdae4589677e05d04906b4391f5587926fa331e3e4 not found: ID does not exist" Apr 20 20:24:49.513323 ip-10-0-141-183 kubenswrapper[2580]: I0420 
20:24:49.513305 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:24:49.516593 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:49.516575 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-b2lx7"] Apr 20 20:24:50.211317 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211285 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-5vgjr"] Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211574 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a74e5889-75c5-4147-af51-e74b339716f4" containerName="manager" Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211584 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74e5889-75c5-4147-af51-e74b339716f4" containerName="manager" Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211602 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" containerName="authorino" Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211607 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" containerName="authorino" Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211650 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a74e5889-75c5-4147-af51-e74b339716f4" containerName="manager" Apr 20 20:24:50.211655 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.211658 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="737c6379-5171-439a-b6ea-1ef4b7a97ddf" containerName="authorino" Apr 20 20:24:50.214669 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.214648 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:24:50.217010 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.216992 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-ghmlq\"" Apr 20 20:24:50.222145 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.222126 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-5vgjr"] Apr 20 20:24:50.286115 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.286094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7hk\" (UniqueName: \"kubernetes.io/projected/9fb75b85-c8b8-42cc-9dec-d9a4485b6879-kube-api-access-2b7hk\") pod \"maas-controller-5bdbc5f8f4-5vgjr\" (UID: \"9fb75b85-c8b8-42cc-9dec-d9a4485b6879\") " pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:24:50.387291 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.387269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7hk\" (UniqueName: \"kubernetes.io/projected/9fb75b85-c8b8-42cc-9dec-d9a4485b6879-kube-api-access-2b7hk\") pod \"maas-controller-5bdbc5f8f4-5vgjr\" (UID: \"9fb75b85-c8b8-42cc-9dec-d9a4485b6879\") " pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:24:50.395421 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.395400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7hk\" (UniqueName: \"kubernetes.io/projected/9fb75b85-c8b8-42cc-9dec-d9a4485b6879-kube-api-access-2b7hk\") pod \"maas-controller-5bdbc5f8f4-5vgjr\" (UID: \"9fb75b85-c8b8-42cc-9dec-d9a4485b6879\") " pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:24:50.525354 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.525331 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:24:50.641825 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.641803 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5bdbc5f8f4-5vgjr"] Apr 20 20:24:50.644042 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:24:50.644005 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb75b85_c8b8_42cc_9dec_d9a4485b6879.slice/crio-e4eb21ce91fdd2c796392209802856aaa10ecebc77a9e239231e75b2f0865a52 WatchSource:0}: Error finding container e4eb21ce91fdd2c796392209802856aaa10ecebc77a9e239231e75b2f0865a52: Status 404 returned error can't find the container with id e4eb21ce91fdd2c796392209802856aaa10ecebc77a9e239231e75b2f0865a52 Apr 20 20:24:50.825854 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:50.825791 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74e5889-75c5-4147-af51-e74b339716f4" path="/var/lib/kubelet/pods/a74e5889-75c5-4147-af51-e74b339716f4/volumes" Apr 20 20:24:51.502072 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:51.502039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" event={"ID":"9fb75b85-c8b8-42cc-9dec-d9a4485b6879","Type":"ContainerStarted","Data":"a80e186b27ed27e6c83b6df442584ae2d7dc12ace2ec72c53bdde13d9cf4a344"} Apr 20 20:24:51.502072 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:51.502072 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" event={"ID":"9fb75b85-c8b8-42cc-9dec-d9a4485b6879","Type":"ContainerStarted","Data":"e4eb21ce91fdd2c796392209802856aaa10ecebc77a9e239231e75b2f0865a52"} Apr 20 20:24:51.502535 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:51.502183 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 
20:24:51.520450 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:24:51.520405 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" podStartSLOduration=1.02776877 podStartE2EDuration="1.520391193s" podCreationTimestamp="2026-04-20 20:24:50 +0000 UTC" firstStartedPulling="2026-04-20 20:24:50.645323066 +0000 UTC m=+822.400693573" lastFinishedPulling="2026-04-20 20:24:51.137945486 +0000 UTC m=+822.893315996" observedRunningTime="2026-04-20 20:24:51.51931262 +0000 UTC m=+823.274683152" watchObservedRunningTime="2026-04-20 20:24:51.520391193 +0000 UTC m=+823.275761721" Apr 20 20:25:02.516029 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:25:02.515995 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5bdbc5f8f4-5vgjr" Apr 20 20:26:08.760687 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:26:08.760603 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:26:08.764457 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:26:08.764436 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:31:08.783503 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:31:08.783470 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:31:08.790123 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:31:08.790104 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:36:08.809018 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:36:08.808984 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:36:08.813343 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:36:08.813322 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:41:08.834997 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:41:08.834944 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:41:08.840230 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:41:08.840210 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:45:45.890147 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:45.890113 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5d4fff88bb-625wj_ebeb173a-fa8b-44bc-8afb-e24da1b6a12f/authorino/0.log" Apr 20 20:45:49.665490 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:49.665460 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7bc8759798-ccpqc_25dd608d-02cb-486a-b792-a7a037bd8bb6/maas-api/0.log" Apr 20 20:45:49.902495 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:49.902458 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5bdbc5f8f4-5vgjr_9fb75b85-c8b8-42cc-9dec-d9a4485b6879/manager/0.log" Apr 20 20:45:50.244136 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:50.244106 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-9tp86_4ad36871-29bc-44b2-b2f7-bc5a2763a8fc/manager/0.log" Apr 20 20:45:51.705667 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:51.705633 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-5d4fff88bb-625wj_ebeb173a-fa8b-44bc-8afb-e24da1b6a12f/authorino/0.log" Apr 20 20:45:52.016889 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:52.016855 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-9mzrg_a23f4d3b-cada-49ca-8a22-7404ff74a485/kuadrant-console-plugin/0.log" Apr 20 20:45:53.000279 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:53.000252 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-66df9c9b9f-9h5fb_38593f2b-438c-4aaf-951c-d88294b014b8/kube-auth-proxy/0.log" Apr 20 20:45:53.968089 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:53.968054 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h_54d39fd1-bb6c-4546-ae98-5861f323e31e/storage-initializer/0.log" Apr 20 20:45:53.976288 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:53.976259 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrfn6h_54d39fd1-bb6c-4546-ae98-5861f323e31e/main/0.log" Apr 20 20:45:54.080626 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:54.080600 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm_2d9ba340-b926-43b0-a689-3f3c3f7a1abc/storage-initializer/0.log" Apr 20 20:45:54.087580 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:45:54.087559 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-frrcm_2d9ba340-b926-43b0-a689-3f3c3f7a1abc/main/0.log" Apr 20 20:46:00.587835 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:00.587807 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d2467_3c8bb7c5-7cf9-4232-8a6e-83bfb1fc0bed/global-pull-secret-syncer/0.log" Apr 20 20:46:00.740804 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:46:00.740774 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vrxxm_8b242afd-b68e-4166-9b12-dcd12f17eca0/konnectivity-agent/0.log" Apr 20 20:46:00.803228 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:00.803205 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-183.ec2.internal_01a299fcf64ba455a08a0abdcaf7bcdd/haproxy/0.log" Apr 20 20:46:05.178634 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:05.178600 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5d4fff88bb-625wj_ebeb173a-fa8b-44bc-8afb-e24da1b6a12f/authorino/0.log" Apr 20 20:46:05.250649 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:05.250624 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-9mzrg_a23f4d3b-cada-49ca-8a22-7404ff74a485/kuadrant-console-plugin/0.log" Apr 20 20:46:06.820509 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:06.820429 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/alertmanager/0.log" Apr 20 20:46:06.848376 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:06.848349 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/config-reloader/0.log" Apr 20 20:46:06.879632 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:06.879608 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/kube-rbac-proxy-web/0.log" Apr 20 20:46:06.929015 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:06.928981 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/kube-rbac-proxy/0.log" Apr 20 20:46:06.967775 
ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:06.967742 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/kube-rbac-proxy-metric/0.log" Apr 20 20:46:07.000756 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.000735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/prom-label-proxy/0.log" Apr 20 20:46:07.027813 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.027794 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d57f93f0-bc8e-4a80-8582-4faac193738d/init-config-reloader/0.log" Apr 20 20:46:07.093260 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.093198 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcgh7_ec3754cb-191c-482e-bbf0-218c49b41734/kube-state-metrics/0.log" Apr 20 20:46:07.136671 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.136650 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcgh7_ec3754cb-191c-482e-bbf0-218c49b41734/kube-rbac-proxy-main/0.log" Apr 20 20:46:07.168502 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.168483 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcgh7_ec3754cb-191c-482e-bbf0-218c49b41734/kube-rbac-proxy-self/0.log" Apr 20 20:46:07.193791 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.193767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fb54d5cbb-nm9lw_8f710e3e-d3cc-474f-ba2d-1bebaec01dcf/metrics-server/0.log" Apr 20 20:46:07.244010 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.243992 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-f7x6t_f7852481-fc7c-425e-ab75-dc92a22dd20c/node-exporter/0.log" Apr 20 20:46:07.268096 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.268075 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f7x6t_f7852481-fc7c-425e-ab75-dc92a22dd20c/kube-rbac-proxy/0.log" Apr 20 20:46:07.294152 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:07.294135 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f7x6t_f7852481-fc7c-425e-ab75-dc92a22dd20c/init-textfile/0.log" Apr 20 20:46:08.029112 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:08.029091 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997bd7fc9-n72rb_505bd79a-a9c9-46d3-aea7-c25765078ece/telemeter-client/0.log" Apr 20 20:46:08.068912 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:08.068888 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997bd7fc9-n72rb_505bd79a-a9c9-46d3-aea7-c25765078ece/reload/0.log" Apr 20 20:46:08.098078 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:08.098051 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997bd7fc9-n72rb_505bd79a-a9c9-46d3-aea7-c25765078ece/kube-rbac-proxy/0.log" Apr 20 20:46:08.854868 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:08.854842 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:46:08.862275 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:08.862256 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:46:09.301831 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.301802 2580 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h"] Apr 20 20:46:09.311441 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.311416 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.314265 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.314241 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"openshift-service-ca.crt\"" Apr 20 20:46:09.314395 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.314269 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"kube-root-ca.crt\"" Apr 20 20:46:09.314395 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.314302 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wj6w5\"/\"default-dockercfg-n67tm\"" Apr 20 20:46:09.314867 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.314844 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h"] Apr 20 20:46:09.371865 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.371846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-podres\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.371990 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.371873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-sys\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " 
pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.371990 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.371894 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-lib-modules\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.371990 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.371971 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn765\" (UniqueName: \"kubernetes.io/projected/8aee1979-8a54-406e-82cf-e5b0144ed157-kube-api-access-kn765\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.372103 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.372031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-proc\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.472971 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.472920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-podres\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.472978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-sys\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-lib-modules\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-sys\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn765\" (UniqueName: \"kubernetes.io/projected/8aee1979-8a54-406e-82cf-e5b0144ed157-kube-api-access-kn765\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-podres\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473114 ip-10-0-141-183 kubenswrapper[2580]: 
I0420 20:46:09.473113 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-proc\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473345 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473148 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-lib-modules\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.473345 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.473181 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aee1979-8a54-406e-82cf-e5b0144ed157-proc\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.482568 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.482547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn765\" (UniqueName: \"kubernetes.io/projected/8aee1979-8a54-406e-82cf-e5b0144ed157-kube-api-access-kn765\") pod \"perf-node-gather-daemonset-j776h\" (UID: \"8aee1979-8a54-406e-82cf-e5b0144ed157\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.622163 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.622105 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:09.744480 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.744455 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h"] Apr 20 20:46:09.745931 ip-10-0-141-183 kubenswrapper[2580]: W0420 20:46:09.745903 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8aee1979_8a54_406e_82cf_e5b0144ed157.slice/crio-d9c8726324950f57affd11ae950445742ce893645c44c6ed1df4cd17099aa9aa WatchSource:0}: Error finding container d9c8726324950f57affd11ae950445742ce893645c44c6ed1df4cd17099aa9aa: Status 404 returned error can't find the container with id d9c8726324950f57affd11ae950445742ce893645c44c6ed1df4cd17099aa9aa Apr 20 20:46:09.747486 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:09.747472 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:46:10.671985 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:10.671935 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" event={"ID":"8aee1979-8a54-406e-82cf-e5b0144ed157","Type":"ContainerStarted","Data":"8c4b0f46330f474e59e795155fc53675ce7083c7cc13d428f03083b4b9f82629"} Apr 20 20:46:10.671985 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:10.671986 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" event={"ID":"8aee1979-8a54-406e-82cf-e5b0144ed157","Type":"ContainerStarted","Data":"d9c8726324950f57affd11ae950445742ce893645c44c6ed1df4cd17099aa9aa"} Apr 20 20:46:10.672382 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:10.672190 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:10.688594 ip-10-0-141-183 
kubenswrapper[2580]: I0420 20:46:10.688554 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" podStartSLOduration=1.6885439020000002 podStartE2EDuration="1.688543902s" podCreationTimestamp="2026-04-20 20:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:46:10.687187543 +0000 UTC m=+2102.442558072" watchObservedRunningTime="2026-04-20 20:46:10.688543902 +0000 UTC m=+2102.443914427" Apr 20 20:46:11.586538 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:11.586506 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b4d9_7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47/dns/0.log" Apr 20 20:46:11.606258 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:11.606237 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b4d9_7c6ab6ed-0ff9-43e5-a89d-3b8dc9b75f47/kube-rbac-proxy/0.log" Apr 20 20:46:11.712029 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:11.712009 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d96qr_803590cd-665f-48a2-83de-4637da6a6b00/dns-node-resolver/0.log" Apr 20 20:46:12.190920 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:12.190893 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-cd4b6f4f4-kcq97_cd3d12ef-f755-4800-bd09-5d0005e38f41/registry/0.log" Apr 20 20:46:12.250834 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:12.250809 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qgfkv_53da44a9-d18c-47a4-b3e2-f5b196e47cbe/node-ca/0.log" Apr 20 20:46:13.144008 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:13.143974 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-66df9c9b9f-9h5fb_38593f2b-438c-4aaf-951c-d88294b014b8/kube-auth-proxy/0.log" Apr 20 20:46:13.730414 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:13.730382 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cxb87_ba7ed180-9b68-40c1-9f30-e9a6a5c96af3/serve-healthcheck-canary/0.log" Apr 20 20:46:14.335378 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:14.335349 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wv5kz_ae0abd1a-d13c-4cc9-b401-6022ce717e69/kube-rbac-proxy/0.log" Apr 20 20:46:14.366116 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:14.366091 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wv5kz_ae0abd1a-d13c-4cc9-b401-6022ce717e69/exporter/0.log" Apr 20 20:46:14.427468 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:14.427444 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wv5kz_ae0abd1a-d13c-4cc9-b401-6022ce717e69/extractor/0.log" Apr 20 20:46:16.324880 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:16.324852 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7bc8759798-ccpqc_25dd608d-02cb-486a-b792-a7a037bd8bb6/maas-api/0.log" Apr 20 20:46:16.400314 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:16.400288 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5bdbc5f8f4-5vgjr_9fb75b85-c8b8-42cc-9dec-d9a4485b6879/manager/0.log" Apr 20 20:46:16.491967 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:16.491920 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-9tp86_4ad36871-29bc-44b2-b2f7-bc5a2763a8fc/manager/0.log" Apr 20 20:46:16.685017 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:16.684962 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-j776h" Apr 20 20:46:17.627385 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:17.627358 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54d459c768-tk8nw_d86e9816-2979-4c19-ac99-08d1b5eab437/manager/0.log" Apr 20 20:46:23.488463 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.488433 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/kube-multus-additional-cni-plugins/0.log" Apr 20 20:46:23.509609 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.509582 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/egress-router-binary-copy/0.log" Apr 20 20:46:23.528750 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.528729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/cni-plugins/0.log" Apr 20 20:46:23.547660 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.547640 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/bond-cni-plugin/0.log" Apr 20 20:46:23.566681 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.566661 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/routeoverride-cni/0.log" Apr 20 20:46:23.585867 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.585844 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/whereabouts-cni-bincopy/0.log" Apr 20 
20:46:23.606076 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.606053 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mql6h_4682a90f-335e-4cef-bd3a-448c0f2a267f/whereabouts-cni/0.log" Apr 20 20:46:23.790167 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.790107 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jfpm5_ba687eca-e6d2-4355-91df-eb1ca17741fe/kube-multus/0.log" Apr 20 20:46:23.885227 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.885203 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z9tzr_ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9/network-metrics-daemon/0.log" Apr 20 20:46:23.905449 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:23.905415 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z9tzr_ac49f1d0-6c1e-4394-8c2b-7f5c9cac6ed9/kube-rbac-proxy/0.log" Apr 20 20:46:24.993751 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:24.993704 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-controller/0.log" Apr 20 20:46:25.010766 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.010737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/0.log" Apr 20 20:46:25.027993 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.027970 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovn-acl-logging/1.log" Apr 20 20:46:25.049781 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.049740 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/kube-rbac-proxy-node/0.log" 
Apr 20 20:46:25.072360 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.072340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 20:46:25.089031 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.089005 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/northd/0.log" Apr 20 20:46:25.110319 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.110292 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/nbdb/0.log" Apr 20 20:46:25.131218 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.131200 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/sbdb/0.log" Apr 20 20:46:25.297101 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:25.297037 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9hml_2bc1e339-b5d7-4ff2-81cc-110408fe4e5f/ovnkube-controller/0.log" Apr 20 20:46:26.628990 ip-10-0-141-183 kubenswrapper[2580]: I0420 20:46:26.628936 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hftxf_0e633d12-e3fe-490f-b2ea-097490061435/network-check-target-container/0.log"