Apr 22 15:58:42.559929 ip-10-0-132-57 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:58:43.108553 ip-10-0-132-57 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:43.108553 ip-10-0-132-57 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:58:43.108553 ip-10-0-132-57 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:43.108553 ip-10-0-132-57 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:58:43.108553 ip-10-0-132-57 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:43.110480 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.110374 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:58:43.112920 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112892 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:43.112920 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112918 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:43.112920 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112922 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:43.112920 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112926 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112930 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112934 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112937 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112940 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112942 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112945 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112948 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112950 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112953 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112955 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112958 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112960 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112963 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112966 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112969 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112972 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112975 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112978 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112980 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:43.113046 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112983 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112985 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112987 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112990 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112992 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112995 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.112998 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113000 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113002 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113005 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113007 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113010 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
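The five deprecation warnings at the top of this block are the actionable part: each of those flags is meant to move into the file passed to --config (/etc/kubernetes/kubelet.conf on this node, per the FLAG dump further down). Below is a minimal sketch of that migration, not OpenShift tooling: the KubeletConfiguration field names (containerRuntimeEndpoint, volumePluginDir, systemReserved) are from the upstream kubelet.config.k8s.io/v1beta1 API and should be verified against your kubelet version, while the values are the ones this node actually logs. Python is used only to emit the document; since JSON is a subset of YAML, the output can be dropped into the --config file for experimentation.

# Sketch: map the deprecated kubelet flags warned about above onto
# KubeletConfiguration (kubelet.config.k8s.io/v1beta1) fields. Field names
# are assumptions to verify; values come from this node's FLAG dump.
import json

kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    "containerRuntimeEndpoint": "/var/run/crio/crio.sock",             # --container-runtime-endpoint
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",  # --volume-plugin-dir
    "systemReserved": {                                                # --system-reserved
        "cpu": "500m",
        "ephemeral-storage": "1Gi",
        "memory": "1Gi",
    },
    # --minimum-container-ttl-duration has no direct config-file field; the
    # warning above says to use --eviction-hard/--eviction-soft instead.
    # --pod-infra-container-image moves to the CRI runtime (the sandbox image
    # comes from CRI per the warning above), not into this file.
}

print(json.dumps(kubelet_config, indent=2))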
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113012 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113014 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113018 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113021 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113024 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113027 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113029 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113032 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:43.113561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113035 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113039 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113043 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113046 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113049 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113065 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113068 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113071 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113074 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113078 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113081 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113084 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113087 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113090 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113093 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113095 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113098 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113101 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113103 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:43.114304 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113106 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113108 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113111 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113115 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113117 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113120 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113122 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113126 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113128 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113132 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113137 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113140 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113142 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113145 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113148 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113150 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113153 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113155 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113158 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113160 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:43.115099 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113163 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113165 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113169 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.113171 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114255 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114272 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114277 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114282 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114286 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114291 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114301 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114305 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114309 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114316 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114323 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114328 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114333 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114337 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114341 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:43.115907 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114348 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114353 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114358 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114367 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114371 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114375 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114379 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114383 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114387 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114391 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114395 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114398 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114402 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114406 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114411 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114416 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114426 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114430 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114434 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:43.116796 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114438 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114442 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114447 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114451 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114455 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114460 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114464 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114468 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114472 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114476 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114488 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114492 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114497 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114502 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114507 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114511 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114515 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114519 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114547 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114554 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:43.117561 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114559 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114569 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114573 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114578 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114582 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114586 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114591 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114596 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114600 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114606 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114611 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114616 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114620 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114624 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114637 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114642 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114646 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114651 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.114656 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115196 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:43.118214 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115372 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115378 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115384 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115389 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115395 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115400 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115406 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115410 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115414 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115418 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115423 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.115427 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115574 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115591 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115603 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115610 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115618 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115623 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115631 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115639 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115644 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:58:43.118854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115649 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115655 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115662 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115667 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115672 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115678 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115683 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115687 2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115692 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115697 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115705 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115711 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115717 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115722 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115727 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115734 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115739 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115744 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115750 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115756 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115760 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115765 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115771 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115776 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115784 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:58:43.119375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115789 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115794 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115798 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115803 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115808 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115816 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115821 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115826 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115831 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115838 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115844 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115849 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115854 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115858 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115863 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115868 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115874 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115879 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115884 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115889 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115893 2576 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115900 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115905 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115910 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115914 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115919 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 15:58:43.120129 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115924 2576 flags.go:64] FLAG: --help="false"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115929 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115933 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115938 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115944 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115951 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115957 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115962 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115966 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115971 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115976 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115981 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115987 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115992 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.115997 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116002 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116007 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116012 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116016 2576 flags.go:64] FLAG: --lock-file=""
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116021 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116025 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116031 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116041 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 15:58:43.120820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116046 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116050 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116055 2576 flags.go:64] FLAG: --logging-format="text"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116060 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116067 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116072 2576 flags.go:64] FLAG: --manifest-url=""
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116077 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116086 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116091 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116098 2576 flags.go:64] FLAG: --max-pods="110"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116102 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116107 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116112 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116117 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116122 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116127 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116132 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116147 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116152 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116162 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116167 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116172 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116181 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116186 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 15:58:43.121422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116192 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116196 2576 flags.go:64] FLAG: --port="10250"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116201 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116206 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0abe0e9ab33478418"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116211 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116217 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116222 2576 flags.go:64] FLAG: --register-node="true"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116227 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116232 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116238 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116243 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116248 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116252 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116259 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116265 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116270 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116275 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116283 2576 flags.go:64] FLAG: --runonce="false"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116287 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116292 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116297 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116302 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116307 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116313 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116319 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116324 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 15:58:43.122028 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116328 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116335 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116339 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116344 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116349 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116355 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116359 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116368 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116373 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116378 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116386 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116392 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116399 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116406 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116412 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116417 2576 flags.go:64] FLAG: --v="2"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116430 2576 flags.go:64] FLAG: --version="false"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116437 2576 flags.go:64] FLAG: --vmodule=""
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116444 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.116450 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116646 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116656 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116661 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116666 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:43.122683 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116674 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116679 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116684 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116688 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116693 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116697 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116701 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116706 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116710 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116716 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116721 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116725 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116729 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116733 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116737 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116741 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116746 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116750 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116754 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:43.123279 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116759 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116763 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116768 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116772 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116777 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116781 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116785 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116789 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116794 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116799 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116802 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116807 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116810 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
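Between the warning floods, the FLAG dump above records every effective flag value, which is usually the quickest way to confirm what this kubelet actually started with. A small sketch for pulling those pairs out of a saved copy of this excerpt (the kubelet.log filename is hypothetical):

# Sketch: collect the 'flags.go:64] FLAG: --name="value"' lines from a saved
# copy of this journal excerpt into a dict; only the standard library is used.
import re

FLAG_RE = re.compile(r'flags\.go:64\] FLAG: (--[\w.-]+)="(.*)"')

flags = {}
with open("kubelet.log") as fh:  # hypothetical path to the saved excerpt
    for line in fh:
        match = FLAG_RE.search(line)
        if match:
            flags[match.group(1)] = match.group(2)

# e.g. the values behind the deprecation warnings at the top of the log
for name in ("--container-runtime-endpoint", "--volume-plugin-dir", "--system-reserved"):
    print(f"{name} = {flags.get(name)!r}")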
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116814 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116819 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116824 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116828 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116832 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116836 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116840 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:43.123906 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116844 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116848 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116854 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116858 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116863 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116867 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116872 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116877 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116881 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116885 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116890 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116894 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116898 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116902 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116906 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116911 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116916 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116920 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116925 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116930 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:43.124405 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116935 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116942 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116948 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116953 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116958 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116963 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116968 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116973 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116978 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116982 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116988 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116995 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.116999 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117003 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117014 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117018 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117023 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117027 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117031 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:43.124973 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117035 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:43.125458 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117039 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:43.125458 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117043 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:43.125458 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:43.117048 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:43.125458 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.117080 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:58:43.125775 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.125745 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:58:43.125813 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.125776 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
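The long run of "unrecognized feature gate" warnings above comes from OpenShift-specific gates that this kubelet's upstream gate table does not know; feature_gate.go warns and ignores each one, and the gate set actually applied is the map in the INFO record at 15:58:43.117080. A minimal sketch (not kubelet code; the file name kubelet.log is an assumption) for tallying the distinct unknown gates in a dump like this one:

```go
// tally_gates.go — counts how often each "unrecognized feature gate" warning
// appears in a journal dump, so the distinct gate names can be reviewed at a glance.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed input file name
	if err != nil {
		panic(err)
	}
	defer f.Close()

	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // some journal records are very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}

	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%3d  %s\n", counts[n], n)
	}
}
```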
Apr 22 15:58:43.130383 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.127406 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:58:43.130589 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.130568 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:58:43.131864 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.131847 2576 server.go:1019] "Starting client certificate rotation"
Apr 22 15:58:43.131973 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.131951 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:58:43.132004 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.131994 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:58:43.162681 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.162652 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:58:43.167602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.167579 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:58:43.183658 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.183625 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:58:43.192335 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.192305 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:58:43.193180 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.193163 2576 log.go:25] "Validated CRI v1 image API"
Apr 22 15:58:43.194859 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.194839 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:58:43.201107 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.201069 2576 fs.go:135] Filesystem UUIDs: map[06529518-432c-4679-9975-fea54ec03427:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e165ea17-f046-40c2-9372-b07e68a2e383:/dev/nvme0n1p3]
Apr 22 15:58:43.201107 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.201099 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:58:43.206333 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.206304 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k9x7w"
Apr 22 15:58:43.208873 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.208737 2576 manager.go:217] Machine: {Timestamp:2026-04-22 15:58:43.205802179 +0000 UTC m=+0.497990894 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099210 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a04fbae73425cf3cfd6bd3920efc1 SystemUUID:ec2a04fb-ae73-425c-f3cf-d6bd3920efc1 BootID:ee02020a-6bd7-486d-a4c1-6a0582b9f691 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:32:37:fe:55:eb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:37:fe:55:eb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:9c:8d:6b:f4:aa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:58:43.208873 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.208864 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
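The client-certificate bootstrap above (bootstrap credentials → CSR csr-k9x7w approved → issued, in the records just below) can be inspected from outside the kubelet. A hedged client-go sketch; the kubeconfig path /etc/kubernetes/kubeconfig is an assumption, and "issued" is read as the signed certificate being populated:

```go
// csr_state.go — inspect the CertificateSigningRequest state the kubelet logs
// during bootstrap. The CSR name "csr-k9x7w" is taken from this log.
package main

import (
	"context"
	"fmt"

	certv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	csr, err := cs.CertificatesV1().CertificateSigningRequests().
		Get(context.TODO(), "csr-k9x7w", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}

	approved := false
	for _, c := range csr.Status.Conditions {
		if c.Type == certv1.CertificateApproved && c.Status == "True" {
			approved = true
		}
	}
	// "Issued" corresponds to Status.Certificate being non-empty.
	fmt.Printf("approved=%v issued=%v\n", approved, len(csr.Status.Certificate) > 0)
}
```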
Apr 22 15:58:43.208996 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.208973 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 15:58:43.212604 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.212549 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 15:58:43.212842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.212605 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-57.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 15:58:43.212892 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.212843 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 15:58:43.212892 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.212853 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 15:58:43.212892 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.212867 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:58:43.213038 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.213020 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k9x7w"
Apr 22 15:58:43.213955 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.213939 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:58:43.214910 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.214897 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:58:43.215050 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.215040 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
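The Node Config record pins SystemReserved memory to 1Gi and the memory.available hard-eviction threshold to 100Mi, while the Machine record above reports MemoryCapacity=32812175360 bytes. Applying the standard node-allocatable formula (allocatable = capacity − kube-reserved − system-reserved − hard eviction) leaves roughly 29.5 GiB for pods. A small sketch of that arithmetic with the values from this log:

```go
// allocatable.go — the standard memory-allocatable calculation, plugged with
// this node's logged values.
package main

import "fmt"

func main() {
	const (
		Mi = int64(1) << 20
		Gi = int64(1) << 30
	)
	capacity := int64(32812175360) // bytes, MemoryCapacity from the Machine record
	kubeReserved := int64(0)       // "KubeReserved":null in the Node Config
	systemReserved := 1 * Gi       // "SystemReserved":{"memory":"1Gi",...}
	hardEviction := 100 * Mi       // memory.available hard-eviction threshold

	allocatable := capacity - kubeReserved - systemReserved - hardEviction
	fmt.Printf("memory allocatable: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(Gi))
}
```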
Apr 22 15:58:43.217867 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.217852 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 15:58:43.218550 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.218538 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 15:58:43.218586 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.218563 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 15:58:43.218586 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.218573 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 22 15:58:43.218586 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.218584 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 15:58:43.220045 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.220024 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:58:43.220114 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.220059 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:58:43.224258 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.224234 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 15:58:43.225836 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.225815 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
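The run of plugins.go:616 records that follows enumerates the kubelet's in-tree volume plugins registering by unique name, with the csi and image plugins last. An illustrative sketch of that registry pattern (not the kubelet's actual plugins.go; names here are invented for the example):

```go
// pluginreg.go — the registry pattern behind the "Loaded volume plugin" lines:
// each plugin registers under a unique name, and duplicates are rejected.
package main

import "fmt"

type VolumePlugin interface{ Name() string }

type hostPath struct{}

func (hostPath) Name() string { return "kubernetes.io/host-path" }

type registry struct{ plugins map[string]VolumePlugin }

func (r *registry) register(p VolumePlugin) error {
	if _, dup := r.plugins[p.Name()]; dup {
		return fmt.Errorf("volume plugin %q registered twice", p.Name())
	}
	r.plugins[p.Name()] = p
	fmt.Printf("Loaded volume plugin %q\n", p.Name())
	return nil
}

func main() {
	r := &registry{plugins: map[string]VolumePlugin{}}
	_ = r.register(hostPath{})
	if err := r.register(hostPath{}); err != nil {
		fmt.Println("error:", err) // second registration is rejected
	}
}
```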
"Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:58:43.229121 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.229069 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:58:43.231760 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.231734 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:43.234284 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234259 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:58:43.234404 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234316 2576 server.go:1295] "Started kubelet" Apr 22 15:58:43.234451 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234419 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:58:43.234492 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234413 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:58:43.234492 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234489 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:58:43.234803 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.234780 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:43.235439 ip-10-0-132-57 systemd[1]: Started Kubernetes Kubelet. Apr 22 15:58:43.238470 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.238447 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:58:43.239445 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.238445 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:58:43.240257 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.240234 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found Apr 22 15:58:43.244731 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.244706 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:58:43.244731 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.244715 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:58:43.245352 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245321 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:58:43.245352 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245341 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:58:43.245352 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245343 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:58:43.245581 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245455 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:58:43.245581 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245464 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:58:43.245989 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245974 2576 factory.go:55] Registering systemd factory Apr 22 15:58:43.246047 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.245993 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:58:43.246190 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.246169 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 22 15:58:43.246295 
Apr 22 15:58:43.246295 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246237 2576 factory.go:153] Registering CRI-O factory
Apr 22 15:58:43.246295 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246254 2576 factory.go:223] Registration of the crio container factory successfully
Apr 22 15:58:43.246383 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246318 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 15:58:43.246383 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246344 2576 factory.go:103] Registering Raw factory
Apr 22 15:58:43.246383 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246361 2576 manager.go:1196] Started watching for new ooms in manager
Apr 22 15:58:43.246832 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.246814 2576 manager.go:319] Starting recovery of all containers
Apr 22 15:58:43.246946 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.246893 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 15:58:43.247032 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.247009 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:58:43.250072 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.249829 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-57.ec2.internal\" not found" node="ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.255937 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.255912 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found
Apr 22 15:58:43.257599 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.257577 2576 manager.go:324] Recovery completed
Apr 22 15:58:43.259057 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.258961 2576 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 15:58:43.262219 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.262203 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:58:43.264331 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264312 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:58:43.264406 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264348 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:58:43.264406 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264360 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:58:43.264898 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264883 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 15:58:43.264898 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264895 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
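The E-level records in this stretch (node "ip-10-0-132-57.ec2.internal" not found, the one-off image GC failure, the nodelease owner-ref failure) are expected on first boot: the Node object does not exist until registration succeeds at 15:58:43.425644 below. A sketch (assuming the dump is one record per line, read from stdin) that tallies klog severities so such bursts stand out:

```go
// severity_scan.go — split journal lines on the klog header
// ("I0422 15:58:43.246169 2576 file.go:123] msg") and count severities.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// klog header: severity letter, MMDD, time, PID, source file:line, "]".
	head := regexp.MustCompile(`([IWEF])\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+\S+:\d+\]`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		if m := head.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for _, sev := range []string{"I", "W", "E", "F"} {
		fmt.Printf("%s: %d\n", sev, counts[sev])
	}
}
```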
Apr 22 15:58:43.265020 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.264932 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:58:43.267190 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.267174 2576 policy_none.go:49] "None policy: Start"
Apr 22 15:58:43.267275 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.267196 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 15:58:43.267275 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.267211 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 15:58:43.311653 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.311625 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313262 2576 manager.go:341] "Starting Device Plugin manager"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.313298 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313311 2576 server.go:85] "Starting device plugin registration server"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313675 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313688 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313810 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313888 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.313897 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.314493 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 15:58:43.323195 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.314548 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-57.ec2.internal\" not found"
Apr 22 15:58:43.385772 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.385678 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 15:58:43.387110 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.387087 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 15:58:43.387259 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.387120 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 15:58:43.387259 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.387147 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 15:58:43.387259 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.387156 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 15:58:43.387259 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:43.387198 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 15:58:43.390788 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.390764 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:58:43.414864 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.414823 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:58:43.416386 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.416362 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:58:43.416502 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.416400 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:58:43.416502 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.416414 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:58:43.416502 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.416445 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.425666 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.425644 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.487347 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.487261 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"]
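The SyncLoop ADD with source="file" is the two static pods arriving from /etc/kubernetes/manifests; the records below show the kubelet creating API-side mirror pods for them. Mirror pods carry the kubernetes.io/config.mirror annotation, so they can be listed from outside the node. A sketch, with the kubeconfig path again an assumption:

```go
// mirror_pods.go — list the mirror pods the kubelet creates for its static pods.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pods, err := cs.CoreV1().Pods("").List(context.TODO(), metav1.ListOptions{
		FieldSelector: "spec.nodeName=ip-10-0-132-57.ec2.internal",
	})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		if _, ok := p.Annotations["kubernetes.io/config.mirror"]; ok {
			fmt.Printf("%s/%s\n", p.Namespace, p.Name)
		}
	}
}
```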
Apr 22 15:58:43.518599 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.518563 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.523368 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.523342 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.527238 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.527213 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:58:43.536074 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.536052 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:58:43.547440 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.547409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.547623 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.547445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.547623 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.547483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod \"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648196 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648196 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648196 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod \"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648394 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648394 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod \"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.648394 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.648233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.830228 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.830180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:43.839017 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:43.838984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal"
Apr 22 15:58:44.131919 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.131824 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 15:58:44.132759 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.132011 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:58:44.132759 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.132035 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:58:44.132759 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.132049 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:58:44.214764 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.214688 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:53:43 +0000 UTC" deadline="2027-12-24 06:46:41.711971022 +0000 UTC"
Apr 22 15:58:44.214764 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.214762 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14654h47m57.497213196s"
Apr 22 15:58:44.218809 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.218783 2576 apiserver.go:52] "Watching apiserver"
Apr 22 15:58:44.225003 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.224968 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 15:58:44.226207 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.226177 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal","openshift-multus/multus-nh7h2","openshift-multus/network-metrics-daemon-5v2vn","openshift-network-operator/iptables-alerter-fnrw7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd","openshift-cluster-node-tuning-operator/tuned-vdtff","openshift-dns/node-resolver-nq959","openshift-image-registry/node-ca-4gt4d","openshift-multus/multus-additional-cni-plugins-kvscs","openshift-network-diagnostics/network-check-target-cxpcn","openshift-ovn-kubernetes/ovnkube-node-pxnsg","kube-system/konnectivity-agent-4qb9z"]
Apr 22 15:58:44.229004 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.228974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nh7h2"
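
Two notes on the entries above. The warnings.go pair is a side effect of mirror-pod naming rather than a misconfiguration: static-pod names get the node name appended, and "ip-10-0-132-57.ec2.internal" contains dots, which are legal in a pod name (a DNS subdomain) but not in the DNS-1123 label the pod's hostname is derived from, hence a warning rather than an error. The certificate_manager pair explains the odd-looking schedule: the freshly issued client certificate is valid until 2028-04-21, yet rotation is set for 2027-12-24, because client-go picks a randomized rotation deadline between 70% and 90% of the certificate's lifetime and then simply sleeps until then (the 14654h figure). A sketch of that computation, assuming the certificate was issued at 15:53:43 on the day of this log with a two-year lifetime (NotBefore is not printed):

```go
// Sketch of client-go's jittered rotation deadline: a uniform point in
// [70%, 90%] of the certificate's validity window. Dates below are taken
// from the log; the issue time and two-year lifetime are assumptions.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notBefore := time.Date(2026, 4, 22, 15, 53, 43, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2028, 4, 21, 15, 53, 43, 0, time.UTC)  // expiration from the log
	fmt.Println("rotate at:", rotationDeadline(notBefore, notAfter))
}
```

The logged deadline falls at roughly 84% of a two-year window, inside the expected band.
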
Apr 22 15:58:44.231118 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:58:44.231118 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231113 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.231290 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn"
Apr 22 15:58:44.231424 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.231497 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mfpfn\""
Apr 22 15:58:44.231497 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.231399 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186"
Apr 22 15:58:44.231626 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.231511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:58:44.233516 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.233495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fnrw7"
Apr 22 15:58:44.235305 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.235283 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.235436 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.235420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.235651 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.235628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4cp5h\""
Apr 22 15:58:44.235651 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.235639 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 15:58:44.235778 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.235756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd"
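
"Error syncing pod, skipping" for network-metrics-daemon is the CNI bootstrap resolving in the intended order: pods on the cluster network stay parked until a CNI configuration appears in /etc/kubernetes/cni/net.d/, while the host-network daemonsets that will eventually write that file (multus, ovnkube-node) are allowed to start. A readiness probe equivalent to the failing condition; the accepted extensions (.conf, .conflist, .json) follow libcni convention and are an assumption, not something this log states:

```go
// Sketch: the condition behind "NetworkPluginNotReady" is, in effect,
// "does any CNI config file exist in the configured directory yet?".
// The directory is quoted directly from the error message above.
package main

import (
	"fmt"
	"path/filepath"
)

func cniReady(dir string) bool {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		if matches, _ := filepath.Glob(filepath.Join(dir, pat)); len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println("CNI ready:", cniReady("/etc/kubernetes/cni/net.d"))
}
```
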
Apr 22 15:58:44.237587 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.237571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wnwkd\""
Apr 22 15:58:44.237764 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.237751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 15:58:44.237815 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.237749 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.237914 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.237899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vdtff"
Apr 22 15:58:44.237976 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.237923 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.240038 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.240010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.240157 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.240145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7szk9\""
Apr 22 15:58:44.240516 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.240498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.240618 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.240563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nq959"
Apr 22 15:58:44.242770 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.242748 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.243194 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.243161 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.243194 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.243173 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hfl9r\""
Apr 22 15:58:44.243346 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.243295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4gt4d"
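
Each reflector named object-"<namespace>"/"<name>" above is a single-object watch: rather than listing whole namespaces, the kubelet caches exactly the secrets and configmaps referenced by the pods it was just assigned. Roughly the same scoping expressed with client-go, using one configmap from the log; the kubeconfig path is illustrative only, and this is a sketch of the pattern, not the kubelet's own secret/configmap manager:

```go
// Sketch: watch exactly one ConfigMap by pinning metadata.name with a field
// selector, the way the per-object reflectors in the log scope their caches.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // illustrative path
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	w, err := client.CoreV1().ConfigMaps("openshift-dns").Watch(context.Background(), metav1.ListOptions{
		FieldSelector: "metadata.name=kube-root-ca.crt",
	})
	if err != nil {
		panic(err)
	}
	for ev := range w.ResultChan() {
		fmt.Println("event:", ev.Type)
	}
}
```

The payoff of this design is that a node only receives updates for objects its pods actually mount, not for every secret in a namespace.
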
Apr 22 15:58:44.244864 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.244846 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 15:58:44.244976 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.244968 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pmvxt\""
Apr 22 15:58:44.245039 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.245024 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 15:58:44.245160 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.245144 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 15:58:44.245301 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.245289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 15:58:44.245522 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.245505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kvscs"
Apr 22 15:58:44.247283 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.247260 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 15:58:44.247377 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.247343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pqx4x\""
Apr 22 15:58:44.247619 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.247510 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 15:58:44.248089 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.248074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn"
Apr 22 15:58:44.248180 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.248161 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13"
Apr 22 15:58:44.251200 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.251175 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.252739 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-multus-certs\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de01641f-09c5-4c2b-a960-1a02116acfd0-host-slash\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-netns\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-kubelet\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8nw9\" (UniqueName: \"kubernetes.io/projected/284f7a34-e743-4f00-9226-bfcbfbabe4a4-kube-api-access-l8nw9\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.252855 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-modprobe-d\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252911 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-os-release\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-bin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.252991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-kubernetes\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/284f7a34-e743-4f00-9226-bfcbfbabe4a4-serviceca\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cnibin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-socket-dir-parent\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-system-cni-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: 
\"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.253166 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-socket-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-systemd\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f691780-d61d-4734-a36c-01d15ac43908-hosts-file\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-system-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-registration-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-sys-fs\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253342 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qf2\" (UniqueName: \"kubernetes.io/projected/0f691780-d61d-4734-a36c-01d15ac43908-kube-api-access-c2qf2\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: 
I0422 15:58:44.253388 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cnibin\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de01641f-09c5-4c2b-a960-1a02116acfd0-iptables-alerter-script\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqc4f\" (UniqueName: \"kubernetes.io/projected/de01641f-09c5-4c2b-a960-1a02116acfd0-kube-api-access-nqc4f\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsl5t\" (UniqueName: \"kubernetes.io/projected/1ac62026-a809-4e5a-9c42-aa4788090bf2-kube-api-access-dsl5t\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-d\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253512 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d8m5g\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253512 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253513 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-lib-modules\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.253897 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-device-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f691780-d61d-4734-a36c-01d15ac43908-tmp-dir\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-tuned\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-os-release\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdc9d\" (UniqueName: 
\"kubernetes.io/projected/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-kube-api-access-pdc9d\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvpb\" (UniqueName: \"kubernetes.io/projected/13a488e0-8f15-4fd1-8913-c002ea52d186-kube-api-access-kbvpb\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrm4\" (UniqueName: \"kubernetes.io/projected/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-kube-api-access-xzrm4\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-multus\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-conf-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-etc-kubernetes\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-conf\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-run\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.253988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-tmp\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.254602 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-hostroot\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/284f7a34-e743-4f00-9226-bfcbfbabe4a4-host\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysconfig\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-var-lib-kubelet\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-sys\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254208 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqq2q\" (UniqueName: \"kubernetes.io/projected/fc9e89c7-0a84-4831-b7e5-5189f30696e3-kube-api-access-lqq2q\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cni-binary-copy\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-k8s-cni-cncf-io\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-daemon-config\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.255069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.254270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-host\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.255828 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.255810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:58:44.256104 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.256088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:58:44.256155 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.256112 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8w28x\"" Apr 22 15:58:44.258182 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.258156 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:58:44.277851 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.277821 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xcgn5" Apr 22 15:58:44.284846 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.284819 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xcgn5" Apr 22 15:58:44.346137 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.346106 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 15:58:44.354688 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354657 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-conf\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.354688 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-run\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-tmp\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-hostroot\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/284f7a34-e743-4f00-9226-bfcbfbabe4a4-host\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-config\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-env-overrides\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.354923 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-conf\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.354923 ip-10-0-132-57 
kubenswrapper[2576]: I0422 15:58:44.354903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-hostroot\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-run\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/284f7a34-e743-4f00-9226-bfcbfbabe4a4-host\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.354982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysconfig\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-var-lib-kubelet\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-ovn\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-var-lib-kubelet\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzwb\" (UniqueName: \"kubernetes.io/projected/d3e14dff-8806-4eb4-92e8-68169209c285-kube-api-access-pmzwb\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysconfig\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355111 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-sys\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqq2q\" (UniqueName: \"kubernetes.io/projected/fc9e89c7-0a84-4831-b7e5-5189f30696e3-kube-api-access-lqq2q\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cni-binary-copy\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-k8s-cni-cncf-io\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.355298 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-daemon-config\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 
15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-bin\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-host\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-multus-certs\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de01641f-09c5-4c2b-a960-1a02116acfd0-host-slash\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-node-log\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f956ae0-432f-4703-aab0-141a0a0a573c-agent-certs\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-netns\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-kubelet\") pod \"multus-nh7h2\" (UID: 
\"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8nw9\" (UniqueName: \"kubernetes.io/projected/284f7a34-e743-4f00-9226-bfcbfbabe4a4-kube-api-access-l8nw9\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-multus-certs\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-slash\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-netd\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-host\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-modprobe-d\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.356135 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-os-release\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-bin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-os-release\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-sys\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2f956ae0-432f-4703-aab0-141a0a0a573c-konnectivity-ca\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de01641f-09c5-4c2b-a960-1a02116acfd0-host-slash\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-kubernetes\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.355879 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/284f7a34-e743-4f00-9226-bfcbfbabe4a4-serviceca\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-k8s-cni-cncf-io\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cnibin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-daemon-config\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-socket-dir-parent\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.356771 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.356017 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:44.855973704 +0000 UTC m=+2.148162445 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-system-cni-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-socket-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-systemd\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f691780-d61d-4734-a36c-01d15ac43908-hosts-file\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-system-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-systemd-units\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-systemd\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-registration-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-sys-fs\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qf2\" (UniqueName: \"kubernetes.io/projected/0f691780-d61d-4734-a36c-01d15ac43908-kube-api-access-c2qf2\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cnibin\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de01641f-09c5-4c2b-a960-1a02116acfd0-iptables-alerter-script\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqc4f\" (UniqueName: \"kubernetes.io/projected/de01641f-09c5-4c2b-a960-1a02116acfd0-kube-api-access-nqc4f\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cni-binary-copy\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-etc-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-kubernetes\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.357371 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/284f7a34-e743-4f00-9226-bfcbfbabe4a4-serviceca\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-cnibin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-run-netns\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-kubelet\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-socket-dir-parent\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.355882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-bin\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-system-cni-dir\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f691780-d61d-4734-a36c-01d15ac43908-hosts-file\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-modprobe-d\") pod \"tuned-vdtff\" (UID: 
\"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-log-socket\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-system-cni-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-registration-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cnibin\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-socket-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.356975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-systemd\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsl5t\" (UniqueName: \"kubernetes.io/projected/1ac62026-a809-4e5a-9c42-aa4788090bf2-kube-api-access-dsl5t\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-sys-fs\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358061 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-d\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-lib-modules\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-netns\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de01641f-09c5-4c2b-a960-1a02116acfd0-iptables-alerter-script\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-var-lib-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-sysctl-d\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3e14dff-8806-4eb4-92e8-68169209c285-ovn-node-metrics-cert\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc9e89c7-0a84-4831-b7e5-5189f30696e3-lib-modules\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.358802 
ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357593 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-kubelet\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-script-lib\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-device-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-device-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f691780-d61d-4734-a36c-01d15ac43908-tmp-dir\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.358802 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-tuned\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-os-release\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdc9d\" (UniqueName: \"kubernetes.io/projected/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-kube-api-access-pdc9d\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.357997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvpb\" (UniqueName: \"kubernetes.io/projected/13a488e0-8f15-4fd1-8913-c002ea52d186-kube-api-access-kbvpb\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrm4\" (UniqueName: \"kubernetes.io/projected/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-kube-api-access-xzrm4\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-multus\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f691780-d61d-4734-a36c-01d15ac43908-tmp-dir\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-conf-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358093 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-multus-conf-dir\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-etc-kubernetes\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-host-var-lib-cni-multus\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ac62026-a809-4e5a-9c42-aa4788090bf2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-etc-kubernetes\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.359884 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.358400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-os-release\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.361567 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.361519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-tmp\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.361705 ip-10-0-132-57 
kubenswrapper[2576]: I0422 15:58:44.361683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc9e89c7-0a84-4831-b7e5-5189f30696e3-etc-tuned\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.365245 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.363659 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:44.365245 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.363686 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:44.365245 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.363701 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:44.365245 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.363829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:44.863804573 +0000 UTC m=+2.155993295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:44.367215 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.367191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qf2\" (UniqueName: \"kubernetes.io/projected/0f691780-d61d-4734-a36c-01d15ac43908-kube-api-access-c2qf2\") pod \"node-resolver-nq959\" (UID: \"0f691780-d61d-4734-a36c-01d15ac43908\") " pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.367420 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.367234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsl5t\" (UniqueName: \"kubernetes.io/projected/1ac62026-a809-4e5a-9c42-aa4788090bf2-kube-api-access-dsl5t\") pod \"aws-ebs-csi-driver-node-nc5bd\" (UID: \"1ac62026-a809-4e5a-9c42-aa4788090bf2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.367590 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.367234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrm4\" (UniqueName: \"kubernetes.io/projected/ea3eaabc-bec9-4b13-b4f1-f400b42ea71a-kube-api-access-xzrm4\") pod \"multus-additional-cni-plugins-kvscs\" (UID: \"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a\") " pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.367803 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.367778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqq2q\" (UniqueName: 
\"kubernetes.io/projected/fc9e89c7-0a84-4831-b7e5-5189f30696e3-kube-api-access-lqq2q\") pod \"tuned-vdtff\" (UID: \"fc9e89c7-0a84-4831-b7e5-5189f30696e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.368009 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.367960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvpb\" (UniqueName: \"kubernetes.io/projected/13a488e0-8f15-4fd1-8913-c002ea52d186-kube-api-access-kbvpb\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.369506 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.369485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqc4f\" (UniqueName: \"kubernetes.io/projected/de01641f-09c5-4c2b-a960-1a02116acfd0-kube-api-access-nqc4f\") pod \"iptables-alerter-fnrw7\" (UID: \"de01641f-09c5-4c2b-a960-1a02116acfd0\") " pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.369760 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.369743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8nw9\" (UniqueName: \"kubernetes.io/projected/284f7a34-e743-4f00-9226-bfcbfbabe4a4-kube-api-access-l8nw9\") pod \"node-ca-4gt4d\" (UID: \"284f7a34-e743-4f00-9226-bfcbfbabe4a4\") " pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.370180 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.370164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdc9d\" (UniqueName: \"kubernetes.io/projected/91027f2c-ef91-41c1-a5c4-9c43eba2e5e5-kube-api-access-pdc9d\") pod \"multus-nh7h2\" (UID: \"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5\") " pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.458345 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.458307 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode758ab3eb0bee98b6ae04d49ace534ee.slice/crio-26b01ee9da616a1d422de727feecdb016f4935ceaecfdc88098de82cabfbbf22 WatchSource:0}: Error finding container 26b01ee9da616a1d422de727feecdb016f4935ceaecfdc88098de82cabfbbf22: Status 404 returned error can't find the container with id 26b01ee9da616a1d422de727feecdb016f4935ceaecfdc88098de82cabfbbf22 Apr 22 15:58:44.458509 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-bin\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458627 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-node-log\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458627 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f956ae0-432f-4703-aab0-141a0a0a573c-agent-certs\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " 
pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.458627 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-bin\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458742 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-slash\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458742 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-node-log\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458742 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-slash\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458742 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-netd\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458894 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2f956ae0-432f-4703-aab0-141a0a0a573c-konnectivity-ca\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.458894 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-cni-netd\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.458894 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-systemd-units\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-systemd\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-etc-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-log-socket\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-netns\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-etc-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459037 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-log-socket\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-var-lib-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-var-lib-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-netns\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.458992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-systemd\") pod \"ovnkube-node-pxnsg\" (UID: 
\"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3e14dff-8806-4eb4-92e8-68169209c285-ovn-node-metrics-cert\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-systemd-units\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-kubelet\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-kubelet\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-script-lib\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-openvswitch\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459294 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f956ae0-432f-4703-aab0-141a0a0a573c-konnectivity-ca\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-config\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-env-overrides\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-ovn\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.459830 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzwb\" (UniqueName: \"kubernetes.io/projected/d3e14dff-8806-4eb4-92e8-68169209c285-kube-api-access-pmzwb\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.460169 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.460080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-script-lib\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.460331 
ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.459632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3e14dff-8806-4eb4-92e8-68169209c285-run-ovn\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.460418 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.460389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-env-overrides\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.460560 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.460517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3e14dff-8806-4eb4-92e8-68169209c285-ovnkube-config\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.461678 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.461661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f956ae0-432f-4703-aab0-141a0a0a573c-agent-certs\") pod \"konnectivity-agent-4qb9z\" (UID: \"2f956ae0-432f-4703-aab0-141a0a0a573c\") " pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.462321 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.462296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3e14dff-8806-4eb4-92e8-68169209c285-ovn-node-metrics-cert\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.465990 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.464958 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:58:44.468956 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.468926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzwb\" (UniqueName: \"kubernetes.io/projected/d3e14dff-8806-4eb4-92e8-68169209c285-kube-api-access-pmzwb\") pod \"ovnkube-node-pxnsg\" (UID: \"d3e14dff-8806-4eb4-92e8-68169209c285\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.472742 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.472708 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad918dee123885abb804caebda37d740.slice/crio-f33a8d2727ae29201891ea75fd10ce1ea3055d787116c5163d11bedd28600c2c WatchSource:0}: Error finding container f33a8d2727ae29201891ea75fd10ce1ea3055d787116c5163d11bedd28600c2c: Status 404 returned error can't find the container with id f33a8d2727ae29201891ea75fd10ce1ea3055d787116c5163d11bedd28600c2c Apr 22 15:58:44.561261 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.561211 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nh7h2" Apr 22 15:58:44.568553 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.568507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91027f2c_ef91_41c1_a5c4_9c43eba2e5e5.slice/crio-3a4155f0435af584a33c17fe1d89c5f45a969374d137866355d380ab397365ff WatchSource:0}: Error finding container 3a4155f0435af584a33c17fe1d89c5f45a969374d137866355d380ab397365ff: Status 404 returned error can't find the container with id 3a4155f0435af584a33c17fe1d89c5f45a969374d137866355d380ab397365ff Apr 22 15:58:44.576820 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.576793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fnrw7" Apr 22 15:58:44.583950 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.583917 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde01641f_09c5_4c2b_a960_1a02116acfd0.slice/crio-3bdf372ef5104afafab7907574e9a9210fa709e41bfe5a6aac29aeac60cd9ac9 WatchSource:0}: Error finding container 3bdf372ef5104afafab7907574e9a9210fa709e41bfe5a6aac29aeac60cd9ac9: Status 404 returned error can't find the container with id 3bdf372ef5104afafab7907574e9a9210fa709e41bfe5a6aac29aeac60cd9ac9 Apr 22 15:58:44.604237 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.604201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" Apr 22 15:58:44.611064 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.611034 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac62026_a809_4e5a_9c42_aa4788090bf2.slice/crio-184a0b9666f74912b24de19935bd93ce436f93cb399f936c1a039e66fb0d201e WatchSource:0}: Error finding container 184a0b9666f74912b24de19935bd93ce436f93cb399f936c1a039e66fb0d201e: Status 404 returned error can't find the container with id 184a0b9666f74912b24de19935bd93ce436f93cb399f936c1a039e66fb0d201e Apr 22 15:58:44.624451 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.624416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vdtff" Apr 22 15:58:44.632207 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.632175 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9e89c7_0a84_4831_b7e5_5189f30696e3.slice/crio-ccf8c137f5009dac0576e79e18d48fcd2f55c4b2fb67eec00a1f951c97216275 WatchSource:0}: Error finding container ccf8c137f5009dac0576e79e18d48fcd2f55c4b2fb67eec00a1f951c97216275: Status 404 returned error can't find the container with id ccf8c137f5009dac0576e79e18d48fcd2f55c4b2fb67eec00a1f951c97216275 Apr 22 15:58:44.638650 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.638621 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nq959" Apr 22 15:58:44.646028 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.645986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f691780_d61d_4734_a36c_01d15ac43908.slice/crio-af080aac983a134d436903158b5ed4f313535ba9a02783f3fa1e36f7e633ef7b WatchSource:0}: Error finding container af080aac983a134d436903158b5ed4f313535ba9a02783f3fa1e36f7e633ef7b: Status 404 returned error can't find the container with id af080aac983a134d436903158b5ed4f313535ba9a02783f3fa1e36f7e633ef7b Apr 22 15:58:44.653910 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.653883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4gt4d" Apr 22 15:58:44.662122 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.662090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod284f7a34_e743_4f00_9226_bfcbfbabe4a4.slice/crio-11af1d5e03dfbc0ceac13569ce1dceb280fe6d916b8002fc1578e67fde0362f9 WatchSource:0}: Error finding container 11af1d5e03dfbc0ceac13569ce1dceb280fe6d916b8002fc1578e67fde0362f9: Status 404 returned error can't find the container with id 11af1d5e03dfbc0ceac13569ce1dceb280fe6d916b8002fc1578e67fde0362f9 Apr 22 15:58:44.662262 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.662144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kvscs" Apr 22 15:58:44.669847 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.669818 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3eaabc_bec9_4b13_b4f1_f400b42ea71a.slice/crio-14cfac7508950408125073c3992697041556a0db5776e82d869745adcef28e9e WatchSource:0}: Error finding container 14cfac7508950408125073c3992697041556a0db5776e82d869745adcef28e9e: Status 404 returned error can't find the container with id 14cfac7508950408125073c3992697041556a0db5776e82d869745adcef28e9e Apr 22 15:58:44.689370 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.689336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:58:44.695197 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.695166 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:58:44.696153 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.696127 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e14dff_8806_4eb4_92e8_68169209c285.slice/crio-21834621ea94431a4a154edb2934e42af634a9361cf64deb8f2aa394d6efce17 WatchSource:0}: Error finding container 21834621ea94431a4a154edb2934e42af634a9361cf64deb8f2aa394d6efce17: Status 404 returned error can't find the container with id 21834621ea94431a4a154edb2934e42af634a9361cf64deb8f2aa394d6efce17 Apr 22 15:58:44.702022 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:58:44.701993 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f956ae0_432f_4703_aab0_141a0a0a573c.slice/crio-66875a3268c4de1519c09e4aa63a5eb99ebdab9edb0574b4b8c752052ed9e4aa WatchSource:0}: Error finding container 66875a3268c4de1519c09e4aa63a5eb99ebdab9edb0574b4b8c752052ed9e4aa: Status 404 returned error can't find the container with id 66875a3268c4de1519c09e4aa63a5eb99ebdab9edb0574b4b8c752052ed9e4aa Apr 22 15:58:44.861949 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.861901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:44.862126 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.862077 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:44.862184 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.862165 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.862142198 +0000 UTC m=+3.154330905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:44.953642 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.953564 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:44.962484 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:44.962445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:44.962758 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.962736 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:44.962882 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.962765 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:44.962882 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.962779 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:44.962882 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:44.962844 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.962825131 +0000 UTC m=+3.255013848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:45.286124 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.285972 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:44 +0000 UTC" deadline="2027-10-12 20:57:33.179149017 +0000 UTC" Apr 22 15:58:45.286124 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.286017 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12916h58m47.893136254s" Apr 22 15:58:45.318819 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.318788 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:45.390189 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.390031 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:45.390189 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.390182 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:45.427195 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.426740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" event={"ID":"ad918dee123885abb804caebda37d740","Type":"ContainerStarted","Data":"f33a8d2727ae29201891ea75fd10ce1ea3055d787116c5163d11bedd28600c2c"} Apr 22 15:58:45.445853 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.445765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerStarted","Data":"14cfac7508950408125073c3992697041556a0db5776e82d869745adcef28e9e"} Apr 22 15:58:45.452844 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.452758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gt4d" event={"ID":"284f7a34-e743-4f00-9226-bfcbfbabe4a4","Type":"ContainerStarted","Data":"11af1d5e03dfbc0ceac13569ce1dceb280fe6d916b8002fc1578e67fde0362f9"} Apr 22 15:58:45.459867 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.459763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nq959" event={"ID":"0f691780-d61d-4734-a36c-01d15ac43908","Type":"ContainerStarted","Data":"af080aac983a134d436903158b5ed4f313535ba9a02783f3fa1e36f7e633ef7b"} Apr 22 15:58:45.463611 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.463299 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:45.478604 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.478476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vdtff" event={"ID":"fc9e89c7-0a84-4831-b7e5-5189f30696e3","Type":"ContainerStarted","Data":"ccf8c137f5009dac0576e79e18d48fcd2f55c4b2fb67eec00a1f951c97216275"} Apr 22 15:58:45.488065 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.487981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" event={"ID":"1ac62026-a809-4e5a-9c42-aa4788090bf2","Type":"ContainerStarted","Data":"184a0b9666f74912b24de19935bd93ce436f93cb399f936c1a039e66fb0d201e"} Apr 22 15:58:45.507828 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.507754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fnrw7" event={"ID":"de01641f-09c5-4c2b-a960-1a02116acfd0","Type":"ContainerStarted","Data":"3bdf372ef5104afafab7907574e9a9210fa709e41bfe5a6aac29aeac60cd9ac9"} Apr 22 15:58:45.536797 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.536656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerStarted","Data":"26b01ee9da616a1d422de727feecdb016f4935ceaecfdc88098de82cabfbbf22"} Apr 22 15:58:45.551161 
ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.551119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4qb9z" event={"ID":"2f956ae0-432f-4703-aab0-141a0a0a573c","Type":"ContainerStarted","Data":"66875a3268c4de1519c09e4aa63a5eb99ebdab9edb0574b4b8c752052ed9e4aa"} Apr 22 15:58:45.560401 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.560304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"21834621ea94431a4a154edb2934e42af634a9361cf64deb8f2aa394d6efce17"} Apr 22 15:58:45.578577 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.578514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nh7h2" event={"ID":"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5","Type":"ContainerStarted","Data":"3a4155f0435af584a33c17fe1d89c5f45a969374d137866355d380ab397365ff"} Apr 22 15:58:45.870502 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.869869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:45.870502 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.870019 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.870502 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.870086 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:47.870065279 +0000 UTC m=+5.162253995 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.970950 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:45.970295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:45.970950 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.970484 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:45.970950 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.970504 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:45.970950 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.970517 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:45.970950 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:45.970599 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:47.970579604 +0000 UTC m=+5.262768323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:46.286461 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:46.286336 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:44 +0000 UTC" deadline="2028-01-25 23:12:43.354525508 +0000 UTC" Apr 22 15:58:46.286461 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:46.286377 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15439h13m57.068153078s" Apr 22 15:58:46.388029 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:46.388001 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:46.388197 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:46.388132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:47.387933 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:47.387666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:47.388324 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.388064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:47.890446 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:47.889723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:47.890446 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.889928 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:47.890446 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.889995 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:51.889976037 +0000 UTC m=+9.182164739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:47.995188 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:47.995144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:47.995388 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.995371 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:47.995428 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.995397 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:47.995428 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.995412 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:47.995497 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:47.995485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:51.995466455 +0000 UTC m=+9.287655173 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:48.387666 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:48.387577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:48.387848 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:48.387714 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:49.389991 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:49.389953 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:49.390483 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:49.390081 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:50.388794 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:50.388376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:50.389556 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:50.389486 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:51.387484 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:51.387448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:51.387996 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:51.387610 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:51.932157 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:51.932115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:51.932322 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:51.932305 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:51.932378 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:51.932372 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:59.932355131 +0000 UTC m=+17.224543832 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:52.032837 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:52.032790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:52.033098 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:52.032995 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:52.033098 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:52.033027 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:52.033098 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:52.033041 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:52.033268 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:52.033131 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:00.033110399 +0000 UTC m=+17.325299114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:52.387725 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:52.387644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:52.388101 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:52.387784 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:53.389108 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:53.389070 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:53.389598 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:53.389222 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:54.387757 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:54.387711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:54.387939 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:54.387844 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:55.388447 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:55.388353 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:55.388933 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:55.388480 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:56.388194 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:56.388160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:56.388396 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:56.388303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:57.388248 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:57.388213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:57.388703 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:57.388334 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:58.387891 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:58.387850 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:58:58.388058 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:58.387963 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:58:59.388329 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:59.388285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:59.388785 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:59.388450 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:58:59.993885 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:58:59.993847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:58:59.994089 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:59.994032 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:59.994170 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:58:59.994143 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:15.994115529 +0000 UTC m=+33.286304251 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:59:00.094340 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:00.094305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:00.094599 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:00.094441 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:59:00.094599 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:00.094462 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:59:00.094599 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:00.094473 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:00.094599 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:00.094551 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.094514194 +0000 UTC m=+33.386702901 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:00.387753 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:00.387653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:00.387936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:00.387778 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:01.388114 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:01.388068 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:01.388690 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:01.388226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:02.387752 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:02.387709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:02.387953 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:02.387857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:03.387967 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:03.387937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:03.388262 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:03.388046 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:04.387852 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.387663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:04.388034 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:04.387924 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:04.639280 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.638944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" event={"ID":"1ac62026-a809-4e5a-9c42-aa4788090bf2","Type":"ContainerStarted","Data":"e12fd728f7bdd206ec56fab74792361cdee74e89198f9b7876bf16f859df0b3c"} Apr 22 15:59:04.640446 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.640418 2576 generic.go:358] "Generic (PLEG): container finished" podID="e758ab3eb0bee98b6ae04d49ace534ee" containerID="019447e7534a89ae9c9b1e1c0da2e50be2a7f5c133fe5151a8755d1d5625abe6" exitCode=0 Apr 22 15:59:04.640587 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.640493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerDied","Data":"019447e7534a89ae9c9b1e1c0da2e50be2a7f5c133fe5151a8755d1d5625abe6"} Apr 22 15:59:04.643342 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.642414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4qb9z" event={"ID":"2f956ae0-432f-4703-aab0-141a0a0a573c","Type":"ContainerStarted","Data":"8f82631979477d3e21e1a265428ea07b6448c63a657171ab4378839970c7b723"} Apr 22 15:59:04.647020 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 15:59:04.647322 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647300 2576 generic.go:358] "Generic (PLEG): container finished" podID="d3e14dff-8806-4eb4-92e8-68169209c285" containerID="9a5ac275914730aecec2c8d5347e8224c0dced3ec4405172e2a54ba37b80129c" exitCode=1 Apr 22 15:59:04.647409 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"4658dcf6898b7e61030a127b888afe36d506d40c538cc73daea979402bc39382"} Apr 22 15:59:04.647409 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"b0e3ea2d2fcf461efaa2ce89e2f21f8258242090ecd60ac81ab3a62541197461"} Apr 22 15:59:04.647409 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"5433ea4705d165ed167085dff9be00b92d057bef16596ddd5089ea312d162914"} Apr 22 15:59:04.647579 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"9a1284c6f1dd11ff2e5e80bb61d1864ec25f3a546b74a9a3e41483099873e60a"} Apr 22 15:59:04.647579 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" 
event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerDied","Data":"9a5ac275914730aecec2c8d5347e8224c0dced3ec4405172e2a54ba37b80129c"} Apr 22 15:59:04.647579 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.647439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"0d12c55926c48ba662206b6d77fba2c76c6097bea472913a8397518cadbba8f7"} Apr 22 15:59:04.648727 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.648706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nh7h2" event={"ID":"91027f2c-ef91-41c1-a5c4-9c43eba2e5e5","Type":"ContainerStarted","Data":"3ae0da2bfb10fb2d26e8b6ae52d677d189aa7eb68989353c32b18d124b912f13"} Apr 22 15:59:04.649967 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.649942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" event={"ID":"ad918dee123885abb804caebda37d740","Type":"ContainerStarted","Data":"ba04384709088ba7f351df9152f856629d23f2761b146f03222085e0fa16e656"} Apr 22 15:59:04.651369 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.651343 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="c6a0beacda26abf6aec2631e5aafceea7c07eb84fdebaa844c5fdee131d8064d" exitCode=0 Apr 22 15:59:04.651470 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.651420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"c6a0beacda26abf6aec2631e5aafceea7c07eb84fdebaa844c5fdee131d8064d"} Apr 22 15:59:04.652841 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.652814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gt4d" event={"ID":"284f7a34-e743-4f00-9226-bfcbfbabe4a4","Type":"ContainerStarted","Data":"5bb3811951c6c929781c18cbe4b6584bf6e261385e6baea01e7ffc5fd870d4ef"} Apr 22 15:59:04.654721 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.654377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nq959" event={"ID":"0f691780-d61d-4734-a36c-01d15ac43908","Type":"ContainerStarted","Data":"210b6cb2175c50f5182591c885dc525e78836a067314befda8062bc618fc706f"} Apr 22 15:59:04.655760 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.655739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vdtff" event={"ID":"fc9e89c7-0a84-4831-b7e5-5189f30696e3","Type":"ContainerStarted","Data":"c30fa79300ea3027a99ebef44e393216654751a07d4dcbf137edbe1cbaaa9d9c"} Apr 22 15:59:04.669969 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.669903 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" podStartSLOduration=21.669881141 podStartE2EDuration="21.669881141s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:59:04.668985347 +0000 UTC m=+21.961174081" watchObservedRunningTime="2026-04-22 15:59:04.669881141 +0000 UTC m=+21.962069867" Apr 22 15:59:04.705911 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.705852 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-4qb9z" podStartSLOduration=3.084677684 podStartE2EDuration="21.705836896s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.70359217 +0000 UTC m=+1.995780872" lastFinishedPulling="2026-04-22 15:59:03.324751363 +0000 UTC m=+20.616940084" observedRunningTime="2026-04-22 15:59:04.705496425 +0000 UTC m=+21.997685149" watchObservedRunningTime="2026-04-22 15:59:04.705836896 +0000 UTC m=+21.998025635" Apr 22 15:59:04.720138 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.720071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nh7h2" podStartSLOduration=2.9315880500000002 podStartE2EDuration="21.720055758s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.570359419 +0000 UTC m=+1.862548121" lastFinishedPulling="2026-04-22 15:59:03.358827128 +0000 UTC m=+20.651015829" observedRunningTime="2026-04-22 15:59:04.719717457 +0000 UTC m=+22.011906181" watchObservedRunningTime="2026-04-22 15:59:04.720055758 +0000 UTC m=+22.012244482" Apr 22 15:59:04.732282 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.732232 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nq959" podStartSLOduration=3.05627845 podStartE2EDuration="21.732217946s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.649430185 +0000 UTC m=+1.941618892" lastFinishedPulling="2026-04-22 15:59:03.325369672 +0000 UTC m=+20.617558388" observedRunningTime="2026-04-22 15:59:04.732133873 +0000 UTC m=+22.024322596" watchObservedRunningTime="2026-04-22 15:59:04.732217946 +0000 UTC m=+22.024406669" Apr 22 15:59:04.744985 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.744927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4gt4d" podStartSLOduration=3.083042201 podStartE2EDuration="21.74490855s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.663798803 +0000 UTC m=+1.955987505" lastFinishedPulling="2026-04-22 15:59:03.325665138 +0000 UTC m=+20.617853854" observedRunningTime="2026-04-22 15:59:04.744332591 +0000 UTC m=+22.036521315" watchObservedRunningTime="2026-04-22 15:59:04.74490855 +0000 UTC m=+22.037097273" Apr 22 15:59:04.758729 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.758655 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vdtff" podStartSLOduration=3.065784933 podStartE2EDuration="21.75863456s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.634060342 +0000 UTC m=+1.926249044" lastFinishedPulling="2026-04-22 15:59:03.326909962 +0000 UTC m=+20.619098671" observedRunningTime="2026-04-22 15:59:04.758305925 +0000 UTC m=+22.050494664" watchObservedRunningTime="2026-04-22 15:59:04.75863456 +0000 UTC m=+22.050823286" Apr 22 15:59:04.916746 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:04.916642 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:59:05.390025 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.388375 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:05.390025 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:05.388508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:05.508055 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.508020 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:59:05.659704 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.659615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" event={"ID":"1ac62026-a809-4e5a-9c42-aa4788090bf2","Type":"ContainerStarted","Data":"c3326e882c3faa2182460095ad380ac74921cae2fa58b5a98490673fa680f48b"} Apr 22 15:59:05.661126 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.661083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fnrw7" event={"ID":"de01641f-09c5-4c2b-a960-1a02116acfd0","Type":"ContainerStarted","Data":"a8fef5e065079c613be3e15c394f313ae04f0eed7c2af3f1e1986ec673499e22"} Apr 22 15:59:05.663688 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.662989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerStarted","Data":"21c4f2909ece134f7799f598e6248d832a6a0bdfd46673ba2ee36ff796209937"} Apr 22 15:59:05.673540 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.673470 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fnrw7" podStartSLOduration=3.934105249 podStartE2EDuration="22.673450839s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.586292252 +0000 UTC m=+1.878480954" lastFinishedPulling="2026-04-22 15:59:03.325637841 +0000 UTC m=+20.617826544" observedRunningTime="2026-04-22 15:59:05.673325229 +0000 UTC m=+22.965513953" watchObservedRunningTime="2026-04-22 15:59:05.673450839 +0000 UTC m=+22.965639565" Apr 22 15:59:05.687179 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:05.687126 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" podStartSLOduration=22.68711012 podStartE2EDuration="22.68711012s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:59:05.686701928 +0000 UTC m=+22.978890652" watchObservedRunningTime="2026-04-22 15:59:05.68711012 +0000 UTC m=+22.979298844" Apr 22 15:59:06.326443 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.326341 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:59:05.508035702Z","UUID":"f6ed5c44-237f-4310-b422-d96a89da8107","Handler":null,"Name":"","Endpoint":""} Apr 22 15:59:06.328227 ip-10-0-132-57 
kubenswrapper[2576]: I0422 15:59:06.328201 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:59:06.328382 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.328236 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:59:06.387620 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.387578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:06.387804 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:06.387737 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:06.667392 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.667286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" event={"ID":"1ac62026-a809-4e5a-9c42-aa4788090bf2","Type":"ContainerStarted","Data":"1227bb334c5dbc0b2f7803327b240ea3b91d2e94b7b6f1763c523a7bc236808a"} Apr 22 15:59:06.670864 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.670834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 15:59:06.671363 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.671325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"e369d2a8a12ca138b61376f7ed1b871673dc5321fef08db46a026f34a5f39014"} Apr 22 15:59:06.683762 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.683642 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nc5bd" podStartSLOduration=1.9781450870000001 podStartE2EDuration="23.683575517s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.612872399 +0000 UTC m=+1.905061101" lastFinishedPulling="2026-04-22 15:59:06.318302815 +0000 UTC m=+23.610491531" observedRunningTime="2026-04-22 15:59:06.682625062 +0000 UTC m=+23.974813801" watchObservedRunningTime="2026-04-22 15:59:06.683575517 +0000 UTC m=+23.975764236" Apr 22 15:59:06.956422 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.956328 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:59:06.957236 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:06.957200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:59:07.387754 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:07.387659 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:07.387925 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:07.387822 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:07.674821 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:07.674727 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4qb9z" Apr 22 15:59:08.388409 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:08.388195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:08.388623 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:08.388511 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:09.390947 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.390770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:09.391638 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:09.391031 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:09.678264 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.678176 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="459be90a5767ec5262408467d784fcc7a11f49b0b9d7ffd3b9179672b8a265d0" exitCode=0 Apr 22 15:59:09.678440 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.678268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"459be90a5767ec5262408467d784fcc7a11f49b0b9d7ffd3b9179672b8a265d0"} Apr 22 15:59:09.681406 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.681380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 15:59:09.681778 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.681750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"0b5cbd4cf745c63bafd7abe8bf6bf1ffe7588ee1307ac078b6c512ed90d0645a"} Apr 22 15:59:09.682337 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:09.682311 2576 scope.go:117] "RemoveContainer" containerID="9a5ac275914730aecec2c8d5347e8224c0dced3ec4405172e2a54ba37b80129c" Apr 22 15:59:10.388133 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.388098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:10.388316 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:10.388234 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:10.686948 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.686861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 15:59:10.687356 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.687202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" event={"ID":"d3e14dff-8806-4eb4-92e8-68169209c285","Type":"ContainerStarted","Data":"1942ba525596528d049fdb3c5b00efb826c758da55f26998f08ebe3681be8cc5"} Apr 22 15:59:10.687461 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.687443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:10.687507 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.687473 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:10.687507 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.687483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:10.689843 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.689808 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="d0daeaf716d2e3412fdd441e72ff003bfc8c722aab5c5f1a3eccfe4267982367" exitCode=0 Apr 22 15:59:10.689984 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.689872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"d0daeaf716d2e3412fdd441e72ff003bfc8c722aab5c5f1a3eccfe4267982367"} Apr 22 15:59:10.708653 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.708264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5v2vn"] Apr 22 15:59:10.708653 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.708462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:10.709494 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:10.709087 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:10.709667 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.709637 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:10.709792 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.709684 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxpcn"] Apr 22 15:59:10.709854 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.709795 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:10.710364 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:10.709907 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:10.710874 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.710853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:10.716141 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:10.716079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" podStartSLOduration=8.704108716 podStartE2EDuration="27.716060572s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.698773278 +0000 UTC m=+1.990961980" lastFinishedPulling="2026-04-22 15:59:03.710725131 +0000 UTC m=+21.002913836" observedRunningTime="2026-04-22 15:59:10.715050951 +0000 UTC m=+28.007239688" watchObservedRunningTime="2026-04-22 15:59:10.716060572 +0000 UTC m=+28.008249297" Apr 22 15:59:11.694279 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:11.694185 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="bdcc52a61c343cd30769651fdf694d4224c8ffd0a0152b8c5a799671d6bf3b52" exitCode=0 Apr 22 15:59:11.694279 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:11.694247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"bdcc52a61c343cd30769651fdf694d4224c8ffd0a0152b8c5a799671d6bf3b52"} Apr 22 15:59:12.388302 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:12.388259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:12.388508 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:12.388401 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:12.388508 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:12.388464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:12.388663 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:12.388606 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:14.387937 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:14.387657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:14.387937 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:14.387677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:14.388615 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:14.387969 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxpcn" podUID="e37369f9-fa77-49bc-b161-0e8777c7ef13" Apr 22 15:59:14.388615 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:14.388052 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5v2vn" podUID="13a488e0-8f15-4fd1-8913-c002ea52d186" Apr 22 15:59:15.468382 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.468353 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeReady" Apr 22 15:59:15.468905 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.468519 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 15:59:15.507213 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.507168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-njwkw"] Apr 22 15:59:15.534046 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.534012 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-57g78"] Apr 22 15:59:15.534230 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.534179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.536739 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.536641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 15:59:15.536739 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.536664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 15:59:15.536955 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.536650 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\"" Apr 22 15:59:15.559466 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.559373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njwkw"] Apr 22 15:59:15.559466 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.559410 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57g78"] Apr 22 15:59:15.559693 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.559573 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:15.561970 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.561945 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 15:59:15.562123 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.561969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\"" Apr 22 15:59:15.562123 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.561944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 15:59:15.562123 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.562010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 15:59:15.608677 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.608633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c54ba0ce-4b35-4707-8ee7-608cb358834b-config-volume\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.608677 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.608688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.608906 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.608742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vw4\" (UniqueName: \"kubernetes.io/projected/c54ba0ce-4b35-4707-8ee7-608cb358834b-kube-api-access-l9vw4\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.608906 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.608774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c54ba0ce-4b35-4707-8ee7-608cb358834b-tmp-dir\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.709106 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vw4\" (UniqueName: \"kubernetes.io/projected/c54ba0ce-4b35-4707-8ee7-608cb358834b-kube-api-access-l9vw4\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.709320 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c54ba0ce-4b35-4707-8ee7-608cb358834b-tmp-dir\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.709320 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c54ba0ce-4b35-4707-8ee7-608cb358834b-config-volume\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.709320 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj78k\" (UniqueName: \"kubernetes.io/projected/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-kube-api-access-vj78k\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:15.709320 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.709320 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:15.709600 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:15.709443 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:15.709600 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:15.709520 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.209495769 +0000 UTC m=+33.501685016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:15.709700 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.709688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c54ba0ce-4b35-4707-8ee7-608cb358834b-tmp-dir\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.710200 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.710163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c54ba0ce-4b35-4707-8ee7-608cb358834b-config-volume\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.721141 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.721097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vw4\" (UniqueName: \"kubernetes.io/projected/c54ba0ce-4b35-4707-8ee7-608cb358834b-kube-api-access-l9vw4\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:15.810572 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.810451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj78k\" (UniqueName: \"kubernetes.io/projected/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-kube-api-access-vj78k\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:15.810572 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.810552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:15.810817 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:15.810690 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:15.810817 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:15.810762 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.310742068 +0000 UTC m=+33.602930770 (durationBeforeRetry 500ms). 
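The MountVolume.SetUp failures for "metrics-tls" and "cert" are a second ordering gap: dns-default-njwkw and ingress-canary-57g78 were scheduled before the operators that mint their serving certificates had created the dns-default-metrics-tls and canary-serving-cert secrets, so each attempt fails with secret "..." not found and is requeued. These records are regular enough to pull out with a short script; a sketch written against the exact message format above (the helper itself is hypothetical, not part of any shipped tooling):

import re
import sys

# Matches kubelet secret.go lines such as:
#   Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
PATTERN = re.compile(r'''Couldn't get secret (\S+): secret "[^"]+" not found''')

def missing_secrets(lines):
    seen = set()
    for line in lines:
        m = PATTERN.search(line)
        if m:
            seen.add(m.group(1))  # namespace/name
    return sorted(seen)

if __name__ == "__main__":
    for ref in missing_secrets(sys.stdin):
        print(ref)

Fed journal output for this unit on stdin, it prints each missing namespace/name once (here openshift-dns/dns-default-metrics-tls and openshift-ingress-canary/canary-serving-cert); oc get secret -n <namespace> <name> then shows whether the owning operator has created it yet. Note the variant a little further down: "object ... not registered" failures are a different condition (the kubelet has not yet begun watching that pod's secrets and configmaps) and would not match this pattern.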
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:15.819626 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:15.819583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj78k\" (UniqueName: \"kubernetes.io/projected/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-kube-api-access-vj78k\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:16.012156 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.012115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:16.012356 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.012293 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:59:16.012409 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.012381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:48.012358894 +0000 UTC m=+65.304547599 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:59:16.113590 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.113479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:16.113767 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.113654 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:59:16.113767 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.113672 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:59:16.113767 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.113683 2576 projected.go:194] Error preparing data for projected volume kube-api-access-484d7 for pod openshift-network-diagnostics/network-check-target-cxpcn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:16.113767 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.113750 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7 
podName:e37369f9-fa77-49bc-b161-0e8777c7ef13 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:48.113732876 +0000 UTC m=+65.405921603 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-484d7" (UniqueName: "kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7") pod "network-check-target-cxpcn" (UID: "e37369f9-fa77-49bc-b161-0e8777c7ef13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:16.214211 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.214174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:16.214412 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.214345 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:16.214470 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.214430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:17.214406181 +0000 UTC m=+34.506594888 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:16.315266 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.315227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:16.315461 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.315385 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:16.315555 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:16.315467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:17.315446856 +0000 UTC m=+34.607635557 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:16.387735 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.387642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:16.387894 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.387649 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:16.390169 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.390127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 15:59:16.390307 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.390174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:59:16.390375 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.390358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:59:16.390428 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.390388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgt28\"" Apr 22 15:59:16.390709 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:16.390682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:59:17.221928 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:17.221884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:17.222832 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:17.222059 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:17.222832 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:17.222142 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:19.222116074 +0000 UTC m=+36.514304777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:17.323097 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:17.323056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:17.323279 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:17.323232 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:17.323320 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:17.323302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:19.323286428 +0000 UTC m=+36.615475129 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:18.712235 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:18.712193 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="f9fc6a806c3f7250a8ea9f3cf7844776a46266201f43ca43082eb59c1f8095a1" exitCode=0 Apr 22 15:59:18.712652 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:18.712264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"f9fc6a806c3f7250a8ea9f3cf7844776a46266201f43ca43082eb59c1f8095a1"} Apr 22 15:59:19.236898 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:19.236861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:19.237067 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:19.237009 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:19.237126 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:19.237093 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:23.237077023 +0000 UTC m=+40.529265724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:19.337582 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:19.337523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:19.337734 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:19.337646 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:19.337734 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:19.337722 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:23.337702887 +0000 UTC m=+40.629891588 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:19.717834 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:19.717554 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea3eaabc-bec9-4b13-b4f1-f400b42ea71a" containerID="dbe298db3f424cf43c785a53b32f60100e06ecb7aeac65a5ae62764899e65aad" exitCode=0 Apr 22 15:59:19.718223 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:19.717638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerDied","Data":"dbe298db3f424cf43c785a53b32f60100e06ecb7aeac65a5ae62764899e65aad"} Apr 22 15:59:20.722736 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:20.722698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvscs" event={"ID":"ea3eaabc-bec9-4b13-b4f1-f400b42ea71a","Type":"ContainerStarted","Data":"c7f7ed24344e3618bafc66505adcd40220526d2773f9c5962b2aa5c75b27f942"} Apr 22 15:59:20.744178 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:20.744124 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kvscs" podStartSLOduration=4.720468394 podStartE2EDuration="37.744107356s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:58:44.671454844 +0000 UTC m=+1.963643546" lastFinishedPulling="2026-04-22 15:59:17.695093804 +0000 UTC m=+34.987282508" observedRunningTime="2026-04-22 15:59:20.742619169 +0000 UTC m=+38.034807893" watchObservedRunningTime="2026-04-22 15:59:20.744107356 +0000 UTC m=+38.036296079" Apr 22 15:59:23.264089 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:23.264016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:23.264492 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:23.264209 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:23.264492 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:23.264277 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:31.264261223 +0000 UTC m=+48.556449925 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:23.365020 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:23.364978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:23.365231 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:23.365113 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:23.365231 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:23.365182 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:31.365161615 +0000 UTC m=+48.657350323 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:31.321307 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:31.321263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:31.321846 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:31.321447 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:31.321846 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:31.321569 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 15:59:47.32152229 +0000 UTC m=+64.613711011 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:31.422488 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:31.422446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:31.422776 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:31.422646 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:31.422776 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:31.422726 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:47.422705629 +0000 UTC m=+64.714894343 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:42.708814 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:42.708780 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxnsg" Apr 22 15:59:46.580588 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.580546 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq"] Apr 22 15:59:46.584914 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.584885 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dd4c648bd-28stq"] Apr 22 15:59:46.585069 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.585034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.590698 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.590655 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 15:59:46.590890 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.590700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:59:46.590890 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.590719 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:59:46.590890 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.590746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 15:59:46.590890 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.590760 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-76g5l\"" Apr 22 15:59:46.591231 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.591215 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.605663 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 15:59:46.605842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5cpxz\"" Apr 22 15:59:46.605842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 15:59:46.605842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605744 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 15:59:46.606006 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 15:59:46.606006 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.605969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 15:59:46.606194 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.606170 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 15:59:46.606447 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.606430 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq"] Apr 22 15:59:46.607200 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.607181 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dd4c648bd-28stq"] Apr 22 15:59:46.735647 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-stats-auth\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.735857 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.735857 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.735857 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl9w\" (UniqueName: 
\"kubernetes.io/projected/a0f34ab0-9da5-40dc-8537-84795ca7da5f-kube-api-access-7bl9w\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.735857 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzpd\" (UniqueName: \"kubernetes.io/projected/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-kube-api-access-2mzpd\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.736042 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.736042 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-default-certificate\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.736042 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.735938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.836700 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.836700 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-default-certificate\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.836700 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836729 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-stats-auth\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:46.836769 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:46.836843 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:46.836849 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:47.336827587 +0000 UTC m=+64.629016290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:46.836932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:47.336915957 +0000 UTC m=+64.629104673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:46.836950 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:47.336939655 +0000 UTC m=+64.629128363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 15:59:46.836984 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.836968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bl9w\" (UniqueName: \"kubernetes.io/projected/a0f34ab0-9da5-40dc-8537-84795ca7da5f-kube-api-access-7bl9w\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.837351 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.837011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzpd\" (UniqueName: \"kubernetes.io/projected/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-kube-api-access-2mzpd\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.837512 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.837486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.840436 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.840405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-default-certificate\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.840436 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.840427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-stats-auth\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:46.845624 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.845599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mzpd\" (UniqueName: \"kubernetes.io/projected/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-kube-api-access-2mzpd\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:46.845799 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:46.845663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl9w\" (UniqueName: \"kubernetes.io/projected/a0f34ab0-9da5-40dc-8537-84795ca7da5f-kube-api-access-7bl9w\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:47.341670 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:47.341629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 15:59:47.341842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:47.341681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:47.341842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:47.341712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:47.341842 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:47.341745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:47.341842 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341801 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341846 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341852 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341898 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls podName:c54ba0ce-4b35-4707-8ee7-608cb358834b nodeName:}" failed. No retries permitted until 2026-04-22 16:00:19.341870812 +0000 UTC m=+96.634059524 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls") pod "dns-default-njwkw" (UID: "c54ba0ce-4b35-4707-8ee7-608cb358834b") : secret "dns-default-metrics-tls" not found Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341929 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:48.341911562 +0000 UTC m=+65.634100283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.341955 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:48.34194003 +0000 UTC m=+65.634128944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:47.342044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.342002 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:48.341981604 +0000 UTC m=+65.634170319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 15:59:47.442872 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:47.442833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 15:59:47.443044 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.442985 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:47.443089 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:47.443052 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert podName:bb23ac61-dcc3-40eb-a485-4e58d6ea6d04 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:19.443034495 +0000 UTC m=+96.735223214 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert") pod "ingress-canary-57g78" (UID: "bb23ac61-dcc3-40eb-a485-4e58d6ea6d04") : secret "canary-serving-cert" not found Apr 22 15:59:48.048002 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.047963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 15:59:48.050642 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.050619 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:59:48.058389 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.058360 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:59:48.058487 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.058466 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs podName:13a488e0-8f15-4fd1-8913-c002ea52d186 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:52.05844956 +0000 UTC m=+129.350638263 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs") pod "network-metrics-daemon-5v2vn" (UID: "13a488e0-8f15-4fd1-8913-c002ea52d186") : secret "metrics-daemon-secret" not found Apr 22 15:59:48.148346 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.148302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:48.150905 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.150878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:59:48.160937 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.160897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:59:48.172306 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.172268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-484d7\" (UniqueName: \"kubernetes.io/projected/e37369f9-fa77-49bc-b161-0e8777c7ef13-kube-api-access-484d7\") pod \"network-check-target-cxpcn\" (UID: \"e37369f9-fa77-49bc-b161-0e8777c7ef13\") " pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:48.208638 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.208601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgt28\"" Apr 22 15:59:48.216352 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.216318 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:48.349338 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.349306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:48.349544 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.349384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:48.349544 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.349411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:48.349544 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.349465 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:50.349446399 +0000 UTC m=+67.641635101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 15:59:48.349544 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.349503 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:59:48.349726 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.349552 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:48.349726 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.349568 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:50.349554005 +0000 UTC m=+67.641742709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 15:59:48.349726 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:48.349606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:50.349590622 +0000 UTC m=+67.641779332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:48.370493 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.370447 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxpcn"] Apr 22 15:59:48.375022 ip-10-0-132-57 kubenswrapper[2576]: W0422 15:59:48.374975 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37369f9_fa77_49bc_b161_0e8777c7ef13.slice/crio-357b6bde4c8cff17fef645fbbfc352d9936831f60ae02ce652b8349d701dce27 WatchSource:0}: Error finding container 357b6bde4c8cff17fef645fbbfc352d9936831f60ae02ce652b8349d701dce27: Status 404 returned error can't find the container with id 357b6bde4c8cff17fef645fbbfc352d9936831f60ae02ce652b8349d701dce27 Apr 22 15:59:48.781521 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:48.781481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxpcn" event={"ID":"e37369f9-fa77-49bc-b161-0e8777c7ef13","Type":"ContainerStarted","Data":"357b6bde4c8cff17fef645fbbfc352d9936831f60ae02ce652b8349d701dce27"} Apr 22 15:59:50.366337 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.366300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:50.366337 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.366351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.366386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.366489 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.366514 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.366556 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle 
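
The manager.go:1169 warning appears to be the kubelet's embedded cAdvisor seeing the new crio-<id> cgroup before the runtime has finished registering the container; the 404 is transient, and the PLEG "ContainerStarted" event that follows carries the same container ID, so the pod is in fact starting. A small helper of the kind one might write to correlate the two lines (hypothetical, for log analysis only):

    package main

    import (
        "fmt"
        "strings"
    )

    // containerIDFromCgroup extracts the container ID from a cgroup path such as
    // ".../kubepods-burstable-pod<uid>.slice/crio-<id>" so it can be matched
    // against the ID in a later PLEG ContainerStarted event.
    func containerIDFromCgroup(path string) string {
        const marker = "/crio-"
        i := strings.LastIndex(path, marker)
        if i < 0 {
            return ""
        }
        return strings.TrimSuffix(path[i+len(marker):], ".scope")
    }

    func main() {
        p := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37369f9_fa77_49bc_b161_0e8777c7ef13.slice/crio-357b6bde4c8cff17fef645fbbfc352d9936831f60ae02ce652b8349d701dce27"
        fmt.Println(containerIDFromCgroup(p))
    }
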
podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:54.366523993 +0000 UTC m=+71.658712707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.366583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 15:59:54.366565549 +0000 UTC m=+71.658754253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 15:59:50.366936 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.366604 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:54.366597553 +0000 UTC m=+71.658786255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:50.622395 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.622297 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5"] Apr 22 15:59:50.628299 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.628268 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:50.631095 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.631067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 15:59:50.632274 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.631656 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 15:59:50.632274 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.631782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:59:50.632274 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.632101 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xgb4r\"" Apr 22 15:59:50.632274 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.632223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5"] Apr 22 15:59:50.669316 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.669265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:50.669522 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.669466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmm75\" (UniqueName: \"kubernetes.io/projected/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-kube-api-access-hmm75\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:50.770065 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.770013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmm75\" (UniqueName: \"kubernetes.io/projected/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-kube-api-access-hmm75\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:50.770244 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.770116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:50.770322 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.770250 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:59:50.770373 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:50.770325 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:51.270303538 +0000 UTC m=+68.562492248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 15:59:50.778918 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:50.778886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmm75\" (UniqueName: \"kubernetes.io/projected/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-kube-api-access-hmm75\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:51.274641 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:51.274593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:51.274831 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:51.274751 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:59:51.274882 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:51.274839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:52.274821028 +0000 UTC m=+69.567009731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 15:59:51.790071 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:51.790029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxpcn" event={"ID":"e37369f9-fa77-49bc-b161-0e8777c7ef13","Type":"ContainerStarted","Data":"6b02635c970b93d94d9d2f5bacec8c4252c52a5348b14c6c814397743a49cff7"} Apr 22 15:59:51.790483 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:51.790150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 15:59:51.804433 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:51.804377 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cxpcn" podStartSLOduration=66.057648653 podStartE2EDuration="1m8.804362128s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:59:48.377135637 +0000 UTC m=+65.669324339" lastFinishedPulling="2026-04-22 15:59:51.123849112 +0000 UTC m=+68.416037814" observedRunningTime="2026-04-22 15:59:51.803901831 +0000 UTC m=+69.096090565" watchObservedRunningTime="2026-04-22 15:59:51.804362128 +0000 UTC m=+69.096550851" Apr 22 15:59:52.283328 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:52.283274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:52.283571 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:52.283440 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:59:52.283571 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:52.283517 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:54.283497944 +0000 UTC m=+71.575686645 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 15:59:54.216749 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.216719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nq959_0f691780-d61d-4734-a36c-01d15ac43908/dns-node-resolver/0.log" Apr 22 15:59:54.299872 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.299837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:54.300004 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.299983 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:59:54.300064 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.300054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:58.300038289 +0000 UTC m=+75.592226991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 15:59:54.400251 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.400209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:54.400376 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.400298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 15:59:54.400376 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.400330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 15:59:54.400459 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.400378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. 
No retries permitted until 2026-04-22 16:00:02.400360444 +0000 UTC m=+79.692549146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 15:59:54.400459 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.400437 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:59:54.400459 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.400437 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:54.400642 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.400495 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:02.400480602 +0000 UTC m=+79.692669305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:59:54.400642 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:54.400512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 16:00:02.400504001 +0000 UTC m=+79.692692703 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 15:59:54.616622 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:54.616517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4gt4d_284f7a34-e743-4f00-9226-bfcbfbabe4a4/node-ca/0.log" Apr 22 15:59:58.332322 ip-10-0-132-57 kubenswrapper[2576]: I0422 15:59:58.332278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 15:59:58.332755 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:58.332441 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:59:58.332755 ip-10-0-132-57 kubenswrapper[2576]: E0422 15:59:58.332544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:00:06.332504237 +0000 UTC m=+83.624692941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 16:00:02.467129 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.466979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 16:00:02.467129 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.467044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:02.467129 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.467078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:02.467638 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:02.467140 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 16:00:02.467638 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:02.467212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 16:00:18.467195589 +0000 UTC m=+95.759384290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : configmap references non-existent config key: service-ca.crt Apr 22 16:00:02.467638 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:02.467150 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 16:00:02.467638 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:02.467239 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls podName:770ecfe6-c426-49c5-aa5c-da656e2dc3c3 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:18.467220911 +0000 UTC m=+95.759409630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-grltq" (UID: "770ecfe6-c426-49c5-aa5c-da656e2dc3c3") : secret "cluster-monitoring-operator-tls" not found Apr 22 16:00:02.467638 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:02.467280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs podName:a0f34ab0-9da5-40dc-8537-84795ca7da5f nodeName:}" failed. No retries permitted until 2026-04-22 16:00:18.467268771 +0000 UTC m=+95.759457477 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs") pod "router-default-7dd4c648bd-28stq" (UID: "a0f34ab0-9da5-40dc-8537-84795ca7da5f") : secret "router-metrics-certs-default" not found Apr 22 16:00:02.890509 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.890424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nqjb7"] Apr 22 16:00:02.893305 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.893287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:02.895647 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.895622 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 16:00:02.895764 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.895697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 16:00:02.896588 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.896572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-n624b\"" Apr 22 16:00:02.896653 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.896605 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 16:00:02.896653 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.896641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 16:00:02.901404 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.901370 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nqjb7"] Apr 22 16:00:02.971750 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.971712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kh62\" (UniqueName: \"kubernetes.io/projected/e4290df6-986b-42c0-8325-0bbf750ee1bf-kube-api-access-8kh62\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:02.971949 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.971787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-key\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:02.971949 
ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:02.971845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-cabundle\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.073120 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.073066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-key\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.073252 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.073147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-cabundle\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.073329 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.073247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kh62\" (UniqueName: \"kubernetes.io/projected/e4290df6-986b-42c0-8325-0bbf750ee1bf-kube-api-access-8kh62\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.073926 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.073907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-cabundle\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.075744 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.075725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4290df6-986b-42c0-8325-0bbf750ee1bf-signing-key\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.081853 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.081828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kh62\" (UniqueName: \"kubernetes.io/projected/e4290df6-986b-42c0-8325-0bbf750ee1bf-kube-api-access-8kh62\") pod \"service-ca-865cb79987-nqjb7\" (UID: \"e4290df6-986b-42c0-8325-0bbf750ee1bf\") " pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.204198 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.204149 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nqjb7" Apr 22 16:00:03.325910 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.325877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nqjb7"] Apr 22 16:00:03.329625 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:03.329578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4290df6_986b_42c0_8325_0bbf750ee1bf.slice/crio-c506073d1b0dd15859cc33eb5476ba6df92d83db9dd8ecfd00826a89bb360c41 WatchSource:0}: Error finding container c506073d1b0dd15859cc33eb5476ba6df92d83db9dd8ecfd00826a89bb360c41: Status 404 returned error can't find the container with id c506073d1b0dd15859cc33eb5476ba6df92d83db9dd8ecfd00826a89bb360c41 Apr 22 16:00:03.818380 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:03.818343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nqjb7" event={"ID":"e4290df6-986b-42c0-8325-0bbf750ee1bf","Type":"ContainerStarted","Data":"c506073d1b0dd15859cc33eb5476ba6df92d83db9dd8ecfd00826a89bb360c41"} Apr 22 16:00:05.824260 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:05.824219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nqjb7" event={"ID":"e4290df6-986b-42c0-8325-0bbf750ee1bf","Type":"ContainerStarted","Data":"010589312aabc2da736c4e3b3eb3b838639e4df42e8e2d5cbd1d30239d1aec8a"} Apr 22 16:00:05.839391 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:05.839342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nqjb7" podStartSLOduration=1.8326272270000001 podStartE2EDuration="3.839324134s" podCreationTimestamp="2026-04-22 16:00:02 +0000 UTC" firstStartedPulling="2026-04-22 16:00:03.331789647 +0000 UTC m=+80.623978364" lastFinishedPulling="2026-04-22 16:00:05.338486567 +0000 UTC m=+82.630675271" observedRunningTime="2026-04-22 16:00:05.83824397 +0000 UTC m=+83.130432693" watchObservedRunningTime="2026-04-22 16:00:05.839324134 +0000 UTC m=+83.131512859" Apr 22 16:00:06.404177 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:06.404127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 16:00:06.404401 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:06.404277 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 16:00:06.404401 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:06.404350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls podName:4d7bba4e-af80-43e6-a122-3fa63ef8aff1 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:22.404330979 +0000 UTC m=+99.696519681 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-l9dd5" (UID: "4d7bba4e-af80-43e6-a122-3fa63ef8aff1") : secret "samples-operator-tls" not found Apr 22 16:00:18.504890 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.504844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 16:00:18.504890 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.504902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:18.505556 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.505040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:18.506297 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.506269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f34ab0-9da5-40dc-8537-84795ca7da5f-service-ca-bundle\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:18.507616 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.507588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f34ab0-9da5-40dc-8537-84795ca7da5f-metrics-certs\") pod \"router-default-7dd4c648bd-28stq\" (UID: \"a0f34ab0-9da5-40dc-8537-84795ca7da5f\") " pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:18.507753 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.507731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/770ecfe6-c426-49c5-aa5c-da656e2dc3c3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-grltq\" (UID: \"770ecfe6-c426-49c5-aa5c-da656e2dc3c3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 16:00:18.698634 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.698569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" Apr 22 16:00:18.705570 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.705504 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:18.833573 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.833517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq"] Apr 22 16:00:18.836857 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:18.836821 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770ecfe6_c426_49c5_aa5c_da656e2dc3c3.slice/crio-7c7e297aba36fbbb09dffa6a31d0c9a17d81a5fc638d239d60929ace6fdaf39a WatchSource:0}: Error finding container 7c7e297aba36fbbb09dffa6a31d0c9a17d81a5fc638d239d60929ace6fdaf39a: Status 404 returned error can't find the container with id 7c7e297aba36fbbb09dffa6a31d0c9a17d81a5fc638d239d60929ace6fdaf39a Apr 22 16:00:18.849910 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.849869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" event={"ID":"770ecfe6-c426-49c5-aa5c-da656e2dc3c3","Type":"ContainerStarted","Data":"7c7e297aba36fbbb09dffa6a31d0c9a17d81a5fc638d239d60929ace6fdaf39a"} Apr 22 16:00:18.851421 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:18.851389 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dd4c648bd-28stq"] Apr 22 16:00:18.855368 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:18.855326 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f34ab0_9da5_40dc_8537_84795ca7da5f.slice/crio-2f27756c5931c173edfbc9dc4c7f10073841bbde051a5bf3b2e44cd00cdb7206 WatchSource:0}: Error finding container 2f27756c5931c173edfbc9dc4c7f10073841bbde051a5bf3b2e44cd00cdb7206: Status 404 returned error can't find the container with id 2f27756c5931c173edfbc9dc4c7f10073841bbde051a5bf3b2e44cd00cdb7206 Apr 22 16:00:19.412735 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.412690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 16:00:19.415357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.415331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c54ba0ce-4b35-4707-8ee7-608cb358834b-metrics-tls\") pod \"dns-default-njwkw\" (UID: \"c54ba0ce-4b35-4707-8ee7-608cb358834b\") " pod="openshift-dns/dns-default-njwkw" Apr 22 16:00:19.448650 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.448615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\"" Apr 22 16:00:19.456621 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.456582 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-njwkw" Apr 22 16:00:19.513829 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.513795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 16:00:19.523078 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.523042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb23ac61-dcc3-40eb-a485-4e58d6ea6d04-cert\") pod \"ingress-canary-57g78\" (UID: \"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04\") " pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 16:00:19.599471 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.599434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njwkw"] Apr 22 16:00:19.602977 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:19.602939 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54ba0ce_4b35_4707_8ee7_608cb358834b.slice/crio-b79f6e52a4e07b0c5dbcee7209246f24933918a01d55fd45f8a30a6955135574 WatchSource:0}: Error finding container b79f6e52a4e07b0c5dbcee7209246f24933918a01d55fd45f8a30a6955135574: Status 404 returned error can't find the container with id b79f6e52a4e07b0c5dbcee7209246f24933918a01d55fd45f8a30a6955135574 Apr 22 16:00:19.772158 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.772125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\"" Apr 22 16:00:19.780812 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.780775 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57g78" Apr 22 16:00:19.855920 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.855887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njwkw" event={"ID":"c54ba0ce-4b35-4707-8ee7-608cb358834b","Type":"ContainerStarted","Data":"b79f6e52a4e07b0c5dbcee7209246f24933918a01d55fd45f8a30a6955135574"} Apr 22 16:00:19.857716 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.857673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dd4c648bd-28stq" event={"ID":"a0f34ab0-9da5-40dc-8537-84795ca7da5f","Type":"ContainerStarted","Data":"eb9dda9e0fea59f1efa8d7c1f624207dc75730925b2eb2e0a310d3d236336a62"} Apr 22 16:00:19.857716 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.857719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dd4c648bd-28stq" event={"ID":"a0f34ab0-9da5-40dc-8537-84795ca7da5f","Type":"ContainerStarted","Data":"2f27756c5931c173edfbc9dc4c7f10073841bbde051a5bf3b2e44cd00cdb7206"} Apr 22 16:00:19.874904 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.874851 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dd4c648bd-28stq" podStartSLOduration=33.874834668 podStartE2EDuration="33.874834668s" podCreationTimestamp="2026-04-22 15:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:00:19.874804609 +0000 UTC m=+97.166993336" watchObservedRunningTime="2026-04-22 16:00:19.874834668 +0000 UTC m=+97.167023421" Apr 22 16:00:19.925233 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:19.925198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57g78"] Apr 22 16:00:19.928409 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:19.928368 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb23ac61_dcc3_40eb_a485_4e58d6ea6d04.slice/crio-9ad0d9315cff2756549d37342ad1683fcc0de17786965e5d6f5ac396268720b9 WatchSource:0}: Error finding container 9ad0d9315cff2756549d37342ad1683fcc0de17786965e5d6f5ac396268720b9: Status 404 returned error can't find the container with id 9ad0d9315cff2756549d37342ad1683fcc0de17786965e5d6f5ac396268720b9 Apr 22 16:00:20.706264 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:20.706228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:20.709564 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:20.709381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:20.862715 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:20.862667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57g78" event={"ID":"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04","Type":"ContainerStarted","Data":"9ad0d9315cff2756549d37342ad1683fcc0de17786965e5d6f5ac396268720b9"} Apr 22 16:00:20.862715 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:20.862722 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:20.863795 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:20.863772 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/router-default-7dd4c648bd-28stq" Apr 22 16:00:22.439142 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.439071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 16:00:22.442357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.442316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d7bba4e-af80-43e6-a122-3fa63ef8aff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-l9dd5\" (UID: \"4d7bba4e-af80-43e6-a122-3fa63ef8aff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 16:00:22.739708 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.739658 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" Apr 22 16:00:22.795938 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.795292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cxpcn" Apr 22 16:00:22.868181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.868141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njwkw" event={"ID":"c54ba0ce-4b35-4707-8ee7-608cb358834b","Type":"ContainerStarted","Data":"c929c3e6a4b043d5ec27d18fba2bfa8f32513f11c6ee44862401f42401e598ae"} Apr 22 16:00:22.868181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.868188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njwkw" event={"ID":"c54ba0ce-4b35-4707-8ee7-608cb358834b","Type":"ContainerStarted","Data":"5f93beff5ab06f27aa2eddb232f17e5b8fb9ea5aae2d1278b85357bce5da94e6"} Apr 22 16:00:22.868451 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.868275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-njwkw" Apr 22 16:00:22.870067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.870016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5"] Apr 22 16:00:22.870337 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.870307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" event={"ID":"770ecfe6-c426-49c5-aa5c-da656e2dc3c3","Type":"ContainerStarted","Data":"9641d8463459344f475552813c93520658252d27ec10aa06d4db0bff0c5271a6"} Apr 22 16:00:22.871844 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.871822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57g78" event={"ID":"bb23ac61-dcc3-40eb-a485-4e58d6ea6d04","Type":"ContainerStarted","Data":"d0e62c615e9d96e1d7c5610bcbf331ce93226a7644fce80148336c8551113f4c"} Apr 22 16:00:22.890007 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.889952 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-njwkw" podStartSLOduration=65.341742905 podStartE2EDuration="1m7.889932991s" podCreationTimestamp="2026-04-22 15:59:15 +0000 UTC" 
firstStartedPulling="2026-04-22 16:00:19.605303011 +0000 UTC m=+96.897491720" lastFinishedPulling="2026-04-22 16:00:22.153493103 +0000 UTC m=+99.445681806" observedRunningTime="2026-04-22 16:00:22.889465335 +0000 UTC m=+100.181654060" watchObservedRunningTime="2026-04-22 16:00:22.889932991 +0000 UTC m=+100.182121717" Apr 22 16:00:22.912955 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.912871 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-grltq" podStartSLOduration=33.599160761 podStartE2EDuration="36.912850257s" podCreationTimestamp="2026-04-22 15:59:46 +0000 UTC" firstStartedPulling="2026-04-22 16:00:18.838600592 +0000 UTC m=+96.130789294" lastFinishedPulling="2026-04-22 16:00:22.152290074 +0000 UTC m=+99.444478790" observedRunningTime="2026-04-22 16:00:22.911315567 +0000 UTC m=+100.203504307" watchObservedRunningTime="2026-04-22 16:00:22.912850257 +0000 UTC m=+100.205038982" Apr 22 16:00:22.927427 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:22.927371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-57g78" podStartSLOduration=65.701015644 podStartE2EDuration="1m7.927349322s" podCreationTimestamp="2026-04-22 15:59:15 +0000 UTC" firstStartedPulling="2026-04-22 16:00:19.930365087 +0000 UTC m=+97.222553789" lastFinishedPulling="2026-04-22 16:00:22.156698765 +0000 UTC m=+99.448887467" observedRunningTime="2026-04-22 16:00:22.926387375 +0000 UTC m=+100.218576089" watchObservedRunningTime="2026-04-22 16:00:22.927349322 +0000 UTC m=+100.219538043" Apr 22 16:00:23.876725 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:23.876679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" event={"ID":"4d7bba4e-af80-43e6-a122-3fa63ef8aff1","Type":"ContainerStarted","Data":"fc0258146f36efcadc9bac62a9e9becabf63fad65f05ffb8614666b2b8c53f39"} Apr 22 16:00:24.880986 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:24.880939 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" event={"ID":"4d7bba4e-af80-43e6-a122-3fa63ef8aff1","Type":"ContainerStarted","Data":"34e70e5d51e4e26bbbd47b42a32ef919ac86e5e5ce96c9df088f77cae4f7fea8"} Apr 22 16:00:24.880986 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:24.880977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" event={"ID":"4d7bba4e-af80-43e6-a122-3fa63ef8aff1","Type":"ContainerStarted","Data":"5cad2be1f31572f25b60f696f3f5c378ed5385b6108ae6267dc50b55ee8e97fd"} Apr 22 16:00:24.899575 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:24.899507 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-l9dd5" podStartSLOduration=33.146064111 podStartE2EDuration="34.899488737s" podCreationTimestamp="2026-04-22 15:59:50 +0000 UTC" firstStartedPulling="2026-04-22 16:00:22.912274979 +0000 UTC m=+100.204463689" lastFinishedPulling="2026-04-22 16:00:24.665699613 +0000 UTC m=+101.957888315" observedRunningTime="2026-04-22 16:00:24.898734715 +0000 UTC m=+102.190923438" watchObservedRunningTime="2026-04-22 16:00:24.899488737 +0000 UTC m=+102.191677460" Apr 22 16:00:25.422712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.422676 2576 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp"] Apr 22 16:00:25.425642 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.425622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.427750 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.427716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 16:00:25.427859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.427767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 16:00:25.428467 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.428448 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 16:00:25.428505 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.428475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 16:00:25.434982 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.434953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp"] Apr 22 16:00:25.506598 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.506563 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8956b97cd-qm8h9"] Apr 22 16:00:25.509896 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.509756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.512439 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.512413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 16:00:25.512737 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.512457 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 16:00:25.512737 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.512605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4mqmh\"" Apr 22 16:00:25.512905 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.512850 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 16:00:25.519432 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.519407 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 16:00:25.522622 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.522593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8956b97cd-qm8h9"] Apr 22 16:00:25.537862 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.537831 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-r6v78"] Apr 22 16:00:25.540944 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.540914 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.543173 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.543150 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lz7ck\"" Apr 22 16:00:25.543312 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.543173 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 16:00:25.543372 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.543336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 16:00:25.543372 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.543367 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 16:00:25.543576 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.543559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 16:00:25.555835 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.555806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6v78"] Apr 22 16:00:25.564792 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.564758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34d9f048-671f-4c2d-bb28-faacd1ede66c-tmp\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.564943 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.564804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/34d9f048-671f-4c2d-bb28-faacd1ede66c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.564943 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.564909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvq2\" (UniqueName: \"kubernetes.io/projected/34d9f048-671f-4c2d-bb28-faacd1ede66c-kube-api-access-fzvq2\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.666007 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.665965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-image-registry-private-configuration\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-tls\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34d9f048-671f-4c2d-bb28-faacd1ede66c-tmp\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.666185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/27117f2e-c258-4f10-8c53-d2baf49a8e53-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.666185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjng\" (UniqueName: \"kubernetes.io/projected/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-api-access-6rjng\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27117f2e-c258-4f10-8c53-d2baf49a8e53-data-volume\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4c85ad4-4658-4c87-9205-179acf53b17d-ca-trust-extracted\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27117f2e-c258-4f10-8c53-d2baf49a8e53-crio-socket\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/34d9f048-671f-4c2d-bb28-faacd1ede66c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-installation-pull-secrets\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-certificates\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvq2\" (UniqueName: \"kubernetes.io/projected/34d9f048-671f-4c2d-bb28-faacd1ede66c-kube-api-access-fzvq2\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.666389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-trusted-ca\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666719 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cb99\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-kube-api-access-9cb99\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666719 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-bound-sa-token\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.666719 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.666719 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.666579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34d9f048-671f-4c2d-bb28-faacd1ede66c-tmp\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.669042 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.669019 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/34d9f048-671f-4c2d-bb28-faacd1ede66c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.674707 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.674642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvq2\" (UniqueName: \"kubernetes.io/projected/34d9f048-671f-4c2d-bb28-faacd1ede66c-kube-api-access-fzvq2\") pod \"klusterlet-addon-workmgr-6d9d754dc9-fm5cp\" (UID: \"34d9f048-671f-4c2d-bb28-faacd1ede66c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.735588 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.735513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:25.767593 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cb99\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-kube-api-access-9cb99\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.767800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-bound-sa-token\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.767800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.767800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-image-registry-private-configuration\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.767800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-tls\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.768022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/27117f2e-c258-4f10-8c53-d2baf49a8e53-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.768022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjng\" (UniqueName: \"kubernetes.io/projected/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-api-access-6rjng\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.768022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27117f2e-c258-4f10-8c53-d2baf49a8e53-data-volume\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.768022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.767982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4c85ad4-4658-4c87-9205-179acf53b17d-ca-trust-extracted\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.768022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27117f2e-c258-4f10-8c53-d2baf49a8e53-crio-socket\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.768249 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-installation-pull-secrets\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.768249 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-certificates\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.768249 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-trusted-ca\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.768431 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.768494 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.768431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27117f2e-c258-4f10-8c53-d2baf49a8e53-crio-socket\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.769166 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.769124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4c85ad4-4658-4c87-9205-179acf53b17d-ca-trust-extracted\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.769573 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.769523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27117f2e-c258-4f10-8c53-d2baf49a8e53-data-volume\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.769823 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.769802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-trusted-ca\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.769908 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.769803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-certificates\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.770845 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.770825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/27117f2e-c258-4f10-8c53-d2baf49a8e53-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.771370 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.771347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-image-registry-private-configuration\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.771562 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.771517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-registry-tls\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 
22 16:00:25.771619 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.771567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4c85ad4-4658-4c87-9205-179acf53b17d-installation-pull-secrets\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.776851 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.776777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjng\" (UniqueName: \"kubernetes.io/projected/27117f2e-c258-4f10-8c53-d2baf49a8e53-kube-api-access-6rjng\") pod \"insights-runtime-extractor-r6v78\" (UID: \"27117f2e-c258-4f10-8c53-d2baf49a8e53\") " pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.777479 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.777051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-bound-sa-token\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.777479 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.777200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cb99\" (UniqueName: \"kubernetes.io/projected/c4c85ad4-4658-4c87-9205-179acf53b17d-kube-api-access-9cb99\") pod \"image-registry-8956b97cd-qm8h9\" (UID: \"c4c85ad4-4658-4c87-9205-179acf53b17d\") " pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.821438 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.821391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:25.851088 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.851053 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6v78" Apr 22 16:00:25.867138 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.867104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp"] Apr 22 16:00:25.870615 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:25.870577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d9f048_671f_4c2d_bb28_faacd1ede66c.slice/crio-ce42c7105bc19a40d2f12fe95abc6be4be39210c2190ae309a6faa6c4f539d71 WatchSource:0}: Error finding container ce42c7105bc19a40d2f12fe95abc6be4be39210c2190ae309a6faa6c4f539d71: Status 404 returned error can't find the container with id ce42c7105bc19a40d2f12fe95abc6be4be39210c2190ae309a6faa6c4f539d71 Apr 22 16:00:25.892299 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.892248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" event={"ID":"34d9f048-671f-4c2d-bb28-faacd1ede66c","Type":"ContainerStarted","Data":"ce42c7105bc19a40d2f12fe95abc6be4be39210c2190ae309a6faa6c4f539d71"} Apr 22 16:00:25.966087 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:25.966050 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8956b97cd-qm8h9"] Apr 22 16:00:25.969871 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:25.969831 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4c85ad4_4658_4c87_9205_179acf53b17d.slice/crio-a2e89655711e988e79dd3c122aa9648d35c0e9cc1f63affe90938c11376b7c12 WatchSource:0}: Error finding container a2e89655711e988e79dd3c122aa9648d35c0e9cc1f63affe90938c11376b7c12: Status 404 returned error can't find the container with id a2e89655711e988e79dd3c122aa9648d35c0e9cc1f63affe90938c11376b7c12 Apr 22 16:00:26.001138 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.001092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6v78"] Apr 22 16:00:26.004835 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:26.004802 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27117f2e_c258_4f10_8c53_d2baf49a8e53.slice/crio-65ae8eb46b6179a087d1f6886e7c1b2cfd63dc1892116f0165c17b28711ad727 WatchSource:0}: Error finding container 65ae8eb46b6179a087d1f6886e7c1b2cfd63dc1892116f0165c17b28711ad727: Status 404 returned error can't find the container with id 65ae8eb46b6179a087d1f6886e7c1b2cfd63dc1892116f0165c17b28711ad727 Apr 22 16:00:26.897227 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.897166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6v78" event={"ID":"27117f2e-c258-4f10-8c53-d2baf49a8e53","Type":"ContainerStarted","Data":"533c21f537c9f07755e0f2a2a2e7c415824148afb86ce301474e2f14749f418e"} Apr 22 16:00:26.897227 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.897219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6v78" event={"ID":"27117f2e-c258-4f10-8c53-d2baf49a8e53","Type":"ContainerStarted","Data":"65ae8eb46b6179a087d1f6886e7c1b2cfd63dc1892116f0165c17b28711ad727"} Apr 22 16:00:26.898989 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.898944 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" event={"ID":"c4c85ad4-4658-4c87-9205-179acf53b17d","Type":"ContainerStarted","Data":"f4c71b23d0ce7e8e6cd5851df2cfccd4c7bb080948e88bba33b8513734c600d8"} Apr 22 16:00:26.899138 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.898992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" event={"ID":"c4c85ad4-4658-4c87-9205-179acf53b17d","Type":"ContainerStarted","Data":"a2e89655711e988e79dd3c122aa9648d35c0e9cc1f63affe90938c11376b7c12"} Apr 22 16:00:26.899138 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.899134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:26.919284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:26.918942 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" podStartSLOduration=1.9189173990000001 podStartE2EDuration="1.918917399s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:00:26.916742634 +0000 UTC m=+104.208931374" watchObservedRunningTime="2026-04-22 16:00:26.918917399 +0000 UTC m=+104.211106124" Apr 22 16:00:27.906498 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:27.906455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6v78" event={"ID":"27117f2e-c258-4f10-8c53-d2baf49a8e53","Type":"ContainerStarted","Data":"5e1a9c2b8e4a441ad3163453dbe57a3c46ffd5580b8b171adfcc66e8be7c2476"} Apr 22 16:00:30.918763 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.918705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6v78" event={"ID":"27117f2e-c258-4f10-8c53-d2baf49a8e53","Type":"ContainerStarted","Data":"90f29b1aa3b7c6c755792a9129d05f35c62d76ef5cbb1df72a73ca8c2eec58d6"} Apr 22 16:00:30.920049 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.920016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" event={"ID":"34d9f048-671f-4c2d-bb28-faacd1ede66c","Type":"ContainerStarted","Data":"5c05fc0d76239ffc0fd3359f87c9adf8e4b13c6584b4a8da453875f23a4cb3b8"} Apr 22 16:00:30.920295 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.920277 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:30.922175 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.922154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" Apr 22 16:00:30.935252 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.935202 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-r6v78" podStartSLOduration=1.8231970149999999 podStartE2EDuration="5.935186016s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:26.055687395 +0000 UTC m=+103.347876113" lastFinishedPulling="2026-04-22 16:00:30.167676412 +0000 UTC m=+107.459865114" observedRunningTime="2026-04-22 16:00:30.934069687 +0000 UTC m=+108.226258411" 
watchObservedRunningTime="2026-04-22 16:00:30.935186016 +0000 UTC m=+108.227374781" Apr 22 16:00:30.947453 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:30.947403 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d9d754dc9-fm5cp" podStartSLOduration=1.6406558100000002 podStartE2EDuration="5.947387505s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:25.873102786 +0000 UTC m=+103.165291487" lastFinishedPulling="2026-04-22 16:00:30.179834464 +0000 UTC m=+107.472023182" observedRunningTime="2026-04-22 16:00:30.946675184 +0000 UTC m=+108.238863913" watchObservedRunningTime="2026-04-22 16:00:30.947387505 +0000 UTC m=+108.239576226" Apr 22 16:00:32.879572 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:32.879517 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-njwkw" Apr 22 16:00:34.093630 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.093581 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tmcqz"] Apr 22 16:00:34.098808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.098776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.101282 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.101248 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 16:00:34.102132 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.101790 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 16:00:34.102132 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.101916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 16:00:34.102132 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.102049 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 16:00:34.102393 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.102319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4m4rr\"" Apr 22 16:00:34.110357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.110329 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"] Apr 22 16:00:34.118043 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.118010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.120468 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.120437 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 16:00:34.120772 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.120747 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 16:00:34.120957 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.120470 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 16:00:34.121217 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.120737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cw79t\"" Apr 22 16:00:34.121810 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.121764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"] Apr 22 16:00:34.238903 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.238860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-root\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.238903 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.238899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-tls\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.238916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-wtmp\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239079 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/490859ab-6bc0-4c2f-ad64-89b70d258a54-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-metrics-client-ca\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.239168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.239497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nxn\" (UniqueName: \"kubernetes.io/projected/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-kube-api-access-w7nxn\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-textfile\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-sys\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.239497 ip-10-0-132-57 
kubenswrapper[2576]: I0422 16:00:34.239309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.239497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.239327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8d7m\" (UniqueName: \"kubernetes.io/projected/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-api-access-q8d7m\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340352 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/490859ab-6bc0-4c2f-ad64-89b70d258a54-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-metrics-client-ca\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nxn\" (UniqueName: \"kubernetes.io/projected/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-kube-api-access-w7nxn\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340561 
ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-textfile\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-sys\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8d7m\" (UniqueName: \"kubernetes.io/projected/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-api-access-q8d7m\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-root\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-tls\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-wtmp\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/490859ab-6bc0-4c2f-ad64-89b70d258a54-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" Apr 22 16:00:34.340867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.340822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.341274 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-metrics-client-ca\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.341274 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:34.341045 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 16:00:34.341274 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-textfile\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.341409 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-sys\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz" Apr 22 16:00:34.341409 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:34.341348 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls podName:490859ab-6bc0-4c2f-ad64-89b70d258a54 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:34.841326303 +0000 UTC m=+112.133515044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-rcqz5" (UID: "490859ab-6bc0-4c2f-ad64-89b70d258a54") : secret "kube-state-metrics-tls" not found Apr 22 16:00:34.341409 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:34.341388 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 16:00:34.341592 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:34.341467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-tls podName:f20cdfa6-60b8-47c1-8cb5-a10b7f43e303 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:34.84144902 +0000 UTC m=+112.133637728 (durationBeforeRetry 500ms). 
Apr 22 16:00:34.341592 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-root\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.341592 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-wtmp\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.341760 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.342005 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.341981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.342187 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.342168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.344316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.344261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.344316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.344268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.352068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.352002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8d7m\" (UniqueName: \"kubernetes.io/projected/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-api-access-q8d7m\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.352561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.352513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nxn\" (UniqueName: \"kubernetes.io/projected/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-kube-api-access-w7nxn\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.846302 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.846255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.846504 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.846330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-tls\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:34.849108 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.849073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/490859ab-6bc0-4c2f-ad64-89b70d258a54-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcqz5\" (UID: \"490859ab-6bc0-4c2f-ad64-89b70d258a54\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:34.849242 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:34.849076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f20cdfa6-60b8-47c1-8cb5-a10b7f43e303-node-exporter-tls\") pod \"node-exporter-tmcqz\" (UID: \"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303\") " pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:35.012374 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.012334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tmcqz"
Apr 22 16:00:35.022862 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:35.022830 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20cdfa6_60b8_47c1_8cb5_a10b7f43e303.slice/crio-5744470a2b80617cfebfc469b17d92b2fbd7b30e977f09a8f6961703b8d6e698 WatchSource:0}: Error finding container 5744470a2b80617cfebfc469b17d92b2fbd7b30e977f09a8f6961703b8d6e698: Status 404 returned error can't find the container with id 5744470a2b80617cfebfc469b17d92b2fbd7b30e977f09a8f6961703b8d6e698
Apr 22 16:00:35.031181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.031147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"
Apr 22 16:00:35.161981 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.161944 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:00:35.167359 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.167326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.169801 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.169771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 16:00:35.169801 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.169771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 16:00:35.170024 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.169948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 16:00:35.170024 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.169955 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 16:00:35.170199 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.170181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 16:00:35.170375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.170251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 16:00:35.170375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.170281 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 16:00:35.170375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.170318 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 16:00:35.170375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.170328 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 16:00:35.171152 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.171134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hcfqp\""
Apr 22 16:00:35.178683 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.178647 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcqz5"]
Apr 22 16:00:35.181678 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.181652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:00:35.183810 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:35.183781 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490859ab_6bc0_4c2f_ad64_89b70d258a54.slice/crio-44a7e93141966dec9dff0d46a5ae6815b9f997dadbc72a1f57840f35dc5e7ffa WatchSource:0}: Error finding container 44a7e93141966dec9dff0d46a5ae6815b9f997dadbc72a1f57840f35dc5e7ffa: Status 404 returned error can't find the container with id 44a7e93141966dec9dff0d46a5ae6815b9f997dadbc72a1f57840f35dc5e7ffa
Apr 22 16:00:35.350370 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350595 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-config-out\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350595 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350595 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350595 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-config-volume\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjmf\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.350808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.351112 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.351112 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.350854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.451834 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-config-volume\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjmf\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.451975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-config-out\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.452631 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.452392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.453669 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.453637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.454855 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.454823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.455425 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.455392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.456505 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.456174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-config-volume\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.457152 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.457035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-config-out\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.457152 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.457080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.457448 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.457428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.457895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.457866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.458248 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.458219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.458365 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.458279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.458867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.458837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.460266 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.460240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjmf\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf\") pod \"alertmanager-main-0\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.480664 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.480624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:00:35.623364 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.623328 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:00:35.627913 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:35.627881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b08d485_46a7_4141_b73d_2fc7c8604222.slice/crio-c664d0553d6a01cf91683802a6bdee4f8422fb55cdeefad4882cdebfca1d45a3 WatchSource:0}: Error finding container c664d0553d6a01cf91683802a6bdee4f8422fb55cdeefad4882cdebfca1d45a3: Status 404 returned error can't find the container with id c664d0553d6a01cf91683802a6bdee4f8422fb55cdeefad4882cdebfca1d45a3
Apr 22 16:00:35.936350 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.936271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"c664d0553d6a01cf91683802a6bdee4f8422fb55cdeefad4882cdebfca1d45a3"}
Apr 22 16:00:35.938475 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.938428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" event={"ID":"490859ab-6bc0-4c2f-ad64-89b70d258a54","Type":"ContainerStarted","Data":"44a7e93141966dec9dff0d46a5ae6815b9f997dadbc72a1f57840f35dc5e7ffa"}
Apr 22 16:00:35.939962 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:35.939913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmcqz" event={"ID":"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303","Type":"ContainerStarted","Data":"5744470a2b80617cfebfc469b17d92b2fbd7b30e977f09a8f6961703b8d6e698"}
Apr 22 16:00:36.945308 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:36.945273 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563" exitCode=0
Apr 22 16:00:36.945895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:36.945378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"}
Apr 22 16:00:36.948409 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:36.948349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" event={"ID":"490859ab-6bc0-4c2f-ad64-89b70d258a54","Type":"ContainerStarted","Data":"12f73b759a8bd28f0eadb1235f682f35acce79741ba92713e7f4a69b313e1f8d"}
Apr 22 16:00:36.951722 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:36.951599 2576 generic.go:358] "Generic (PLEG): container finished" podID="f20cdfa6-60b8-47c1-8cb5-a10b7f43e303" containerID="3d10f0f38d6267c5c8af47975890d83d713b2188c544bd6db1c193ec39bb2610" exitCode=0
Apr 22 16:00:36.951722 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:36.951669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmcqz" event={"ID":"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303","Type":"ContainerDied","Data":"3d10f0f38d6267c5c8af47975890d83d713b2188c544bd6db1c193ec39bb2610"}
Apr 22 16:00:37.957065 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.957026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" event={"ID":"490859ab-6bc0-4c2f-ad64-89b70d258a54","Type":"ContainerStarted","Data":"fc779001684291d7ad2bea2a5cb3616931878138dd2ec262f87ee0a258ccbf94"}
Apr 22 16:00:37.957572 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.957075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" event={"ID":"490859ab-6bc0-4c2f-ad64-89b70d258a54","Type":"ContainerStarted","Data":"51e9ddfa781a33deec734c6ed55f776b07e9a9397f19d0ae8c5afc5567881305"}
Apr 22 16:00:37.959270 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.959237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmcqz" event={"ID":"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303","Type":"ContainerStarted","Data":"43dde6a2718763449efcba7390cfe29dbe2b70d46ba6c38bf21c693cc4708b2d"}
Apr 22 16:00:37.959390 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.959287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmcqz" event={"ID":"f20cdfa6-60b8-47c1-8cb5-a10b7f43e303","Type":"ContainerStarted","Data":"e86c3206aab37452be937e752412fd78a8438d56afbef97d50db7cc953e1e618"}
Apr 22 16:00:37.974477 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.974425 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcqz5" podStartSLOduration=2.395711248 podStartE2EDuration="3.97441052s" podCreationTimestamp="2026-04-22 16:00:34 +0000 UTC" firstStartedPulling="2026-04-22 16:00:35.186479725 +0000 UTC m=+112.478668427" lastFinishedPulling="2026-04-22 16:00:36.765178997 +0000 UTC m=+114.057367699" observedRunningTime="2026-04-22 16:00:37.972511578 +0000 UTC m=+115.264700297" watchObservedRunningTime="2026-04-22 16:00:37.97441052 +0000 UTC m=+115.266599243"
Apr 22 16:00:37.989217 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:37.989160 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tmcqz" podStartSLOduration=2.998145049 podStartE2EDuration="3.98912711s" podCreationTimestamp="2026-04-22 16:00:34 +0000 UTC" firstStartedPulling="2026-04-22 16:00:35.024508263 +0000 UTC m=+112.316696964" lastFinishedPulling="2026-04-22 16:00:36.01549031 +0000 UTC m=+113.307679025" observedRunningTime="2026-04-22 16:00:37.988265013 +0000 UTC m=+115.280453750" watchObservedRunningTime="2026-04-22 16:00:37.98912711 +0000 UTC m=+115.281315835"
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:38.867370 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.867343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7h8zt\"" Apr 22 16:00:38.867518 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.867343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 16:00:38.873128 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.873075 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8"] Apr 22 16:00:38.881173 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.881120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5pvh8\" (UID: \"70f25860-e176-408f-8756-ca502709bcc8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:38.966712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.966601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"} Apr 22 16:00:38.966712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.966654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"} Apr 22 16:00:38.966712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.966668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"} Apr 22 16:00:38.966712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.966681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"} Apr 22 16:00:38.966712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.966694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"} Apr 22 16:00:38.982065 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:38.982017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5pvh8\" (UID: \"70f25860-e176-408f-8756-ca502709bcc8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:38.982257 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:00:38.982160 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 16:00:38.982257 ip-10-0-132-57 
kubenswrapper[2576]: E0422 16:00:38.982225 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert podName:70f25860-e176-408f-8756-ca502709bcc8 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:39.482206146 +0000 UTC m=+116.774394863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-5pvh8" (UID: "70f25860-e176-408f-8756-ca502709bcc8") : secret "monitoring-plugin-cert" not found Apr 22 16:00:39.486286 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.486252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5pvh8\" (UID: \"70f25860-e176-408f-8756-ca502709bcc8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:39.488990 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.488952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70f25860-e176-408f-8756-ca502709bcc8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5pvh8\" (UID: \"70f25860-e176-408f-8756-ca502709bcc8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:39.779448 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.779334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:39.922748 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.922713 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8"] Apr 22 16:00:39.926519 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:39.926477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f25860_e176_408f_8756_ca502709bcc8.slice/crio-937d24f6d59ae82fa24911cc899f586daad6d0913ebd785051cf564b6ec50059 WatchSource:0}: Error finding container 937d24f6d59ae82fa24911cc899f586daad6d0913ebd785051cf564b6ec50059: Status 404 returned error can't find the container with id 937d24f6d59ae82fa24911cc899f586daad6d0913ebd785051cf564b6ec50059 Apr 22 16:00:39.973319 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.973275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerStarted","Data":"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"} Apr 22 16:00:39.974483 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.974452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" event={"ID":"70f25860-e176-408f-8756-ca502709bcc8","Type":"ContainerStarted","Data":"937d24f6d59ae82fa24911cc899f586daad6d0913ebd785051cf564b6ec50059"} Apr 22 16:00:39.999415 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:39.999362 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.501350288 podStartE2EDuration="4.99934512s" podCreationTimestamp="2026-04-22 16:00:35 +0000 UTC" firstStartedPulling="2026-04-22 
16:00:35.630503696 +0000 UTC m=+112.922692406" lastFinishedPulling="2026-04-22 16:00:39.128498536 +0000 UTC m=+116.420687238" observedRunningTime="2026-04-22 16:00:39.998267959 +0000 UTC m=+117.290456674" watchObservedRunningTime="2026-04-22 16:00:39.99934512 +0000 UTC m=+117.291533843" Apr 22 16:00:41.981712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:41.981672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" event={"ID":"70f25860-e176-408f-8756-ca502709bcc8","Type":"ContainerStarted","Data":"97165ef358c589225dc57b511d5aa94c17f3f2ad1e9a0a6e46c368802202e64b"} Apr 22 16:00:41.982143 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:41.981802 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:41.986753 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:41.986724 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" Apr 22 16:00:41.996540 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:41.996483 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5pvh8" podStartSLOduration=2.869574183 podStartE2EDuration="3.996462335s" podCreationTimestamp="2026-04-22 16:00:38 +0000 UTC" firstStartedPulling="2026-04-22 16:00:39.928872352 +0000 UTC m=+117.221061054" lastFinishedPulling="2026-04-22 16:00:41.055760504 +0000 UTC m=+118.347949206" observedRunningTime="2026-04-22 16:00:41.996309556 +0000 UTC m=+119.288498282" watchObservedRunningTime="2026-04-22 16:00:41.996462335 +0000 UTC m=+119.288651053" Apr 22 16:00:47.911445 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:47.911413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8956b97cd-qm8h9" Apr 22 16:00:49.683495 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.683455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wnt9h"] Apr 22 16:00:49.685701 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.685679 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wnt9h" Apr 22 16:00:49.688174 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.688145 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 16:00:49.688174 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.688167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9k8wh\"" Apr 22 16:00:49.688374 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.688193 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 16:00:49.696671 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.696638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wnt9h"] Apr 22 16:00:49.775963 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.775923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2pj\" (UniqueName: \"kubernetes.io/projected/4614c8ab-95b2-46ee-b4a6-3b5e1953986f-kube-api-access-nr2pj\") pod \"downloads-6bcc868b7-wnt9h\" (UID: \"4614c8ab-95b2-46ee-b4a6-3b5e1953986f\") " pod="openshift-console/downloads-6bcc868b7-wnt9h" Apr 22 16:00:49.876895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.876855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2pj\" (UniqueName: \"kubernetes.io/projected/4614c8ab-95b2-46ee-b4a6-3b5e1953986f-kube-api-access-nr2pj\") pod \"downloads-6bcc868b7-wnt9h\" (UID: \"4614c8ab-95b2-46ee-b4a6-3b5e1953986f\") " pod="openshift-console/downloads-6bcc868b7-wnt9h" Apr 22 16:00:49.885355 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.885315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2pj\" (UniqueName: \"kubernetes.io/projected/4614c8ab-95b2-46ee-b4a6-3b5e1953986f-kube-api-access-nr2pj\") pod \"downloads-6bcc868b7-wnt9h\" (UID: \"4614c8ab-95b2-46ee-b4a6-3b5e1953986f\") " pod="openshift-console/downloads-6bcc868b7-wnt9h" Apr 22 16:00:49.995518 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:49.995460 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wnt9h" Apr 22 16:00:50.126670 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:50.126635 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wnt9h"] Apr 22 16:00:50.130262 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:50.130216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4614c8ab_95b2_46ee_b4a6_3b5e1953986f.slice/crio-81cee663e5921b793f41aad37bef138f57a3a9e0d3a330c62428ae0a0352b0a8 WatchSource:0}: Error finding container 81cee663e5921b793f41aad37bef138f57a3a9e0d3a330c62428ae0a0352b0a8: Status 404 returned error can't find the container with id 81cee663e5921b793f41aad37bef138f57a3a9e0d3a330c62428ae0a0352b0a8 Apr 22 16:00:51.009579 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:51.009513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wnt9h" event={"ID":"4614c8ab-95b2-46ee-b4a6-3b5e1953986f","Type":"ContainerStarted","Data":"81cee663e5921b793f41aad37bef138f57a3a9e0d3a330c62428ae0a0352b0a8"} Apr 22 16:00:52.097854 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:52.097809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 16:00:52.100587 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:52.100560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13a488e0-8f15-4fd1-8913-c002ea52d186-metrics-certs\") pod \"network-metrics-daemon-5v2vn\" (UID: \"13a488e0-8f15-4fd1-8913-c002ea52d186\") " pod="openshift-multus/network-metrics-daemon-5v2vn" Apr 22 16:00:52.401510 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:52.401422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 16:00:52.410397 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:52.410363 2576 util.go:30] "No sandbox for pod can be found. 
Apr 22 16:00:52.568594 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:52.568558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5v2vn"]
Apr 22 16:00:52.571734 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:00:52.571698 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a488e0_8f15_4fd1_8913_c002ea52d186.slice/crio-7fdfb1b7ec100e24904e1df94f316da344b722f131bb43edf5a70becbbf4641a WatchSource:0}: Error finding container 7fdfb1b7ec100e24904e1df94f316da344b722f131bb43edf5a70becbbf4641a: Status 404 returned error can't find the container with id 7fdfb1b7ec100e24904e1df94f316da344b722f131bb43edf5a70becbbf4641a
Apr 22 16:00:53.016816 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:53.016777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5v2vn" event={"ID":"13a488e0-8f15-4fd1-8913-c002ea52d186","Type":"ContainerStarted","Data":"7fdfb1b7ec100e24904e1df94f316da344b722f131bb43edf5a70becbbf4641a"}
Apr 22 16:00:55.025248 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:55.025204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5v2vn" event={"ID":"13a488e0-8f15-4fd1-8913-c002ea52d186","Type":"ContainerStarted","Data":"a9ce77a38c3642849a2894e3a01ce10d29b296ba4f90500e58f6626203eb17ed"}
Apr 22 16:00:55.025248 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:55.025248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5v2vn" event={"ID":"13a488e0-8f15-4fd1-8913-c002ea52d186","Type":"ContainerStarted","Data":"23092c21b88d4bc8bf84dae0016c3d399874be81807395e3321825ccfa7155ce"}
Apr 22 16:00:55.040951 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:00:55.040895 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5v2vn" podStartSLOduration=130.54033585 podStartE2EDuration="2m12.040873833s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 16:00:52.573854697 +0000 UTC m=+129.866043402" lastFinishedPulling="2026-04-22 16:00:54.074392675 +0000 UTC m=+131.366581385" observedRunningTime="2026-04-22 16:00:55.038578072 +0000 UTC m=+132.330766793" watchObservedRunningTime="2026-04-22 16:00:55.040873833 +0000 UTC m=+132.333062559"
Apr 22 16:01:10.074903 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:10.074858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wnt9h" event={"ID":"4614c8ab-95b2-46ee-b4a6-3b5e1953986f","Type":"ContainerStarted","Data":"a33f767b2c0f780148e60d97f88f001c42014ef582c6d70b0e8fa12d2daa8408"}
Apr 22 16:01:10.075440 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:10.075415 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-wnt9h"
Apr 22 16:01:10.090723 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:10.090685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wnt9h"
Apr 22 16:01:10.092207 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:10.092094 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wnt9h" podStartSLOduration=2.08801368 podStartE2EDuration="21.09207627s" podCreationTimestamp="2026-04-22 16:00:49 +0000 UTC" firstStartedPulling="2026-04-22 16:00:50.132145285 +0000 UTC m=+127.424333987" lastFinishedPulling="2026-04-22 16:01:09.136207875 +0000 UTC m=+146.428396577" observedRunningTime="2026-04-22 16:01:10.090622404 +0000 UTC m=+147.382811129" watchObservedRunningTime="2026-04-22 16:01:10.09207627 +0000 UTC m=+147.384264995"
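The pod_startup_latency_tracker record above decomposes cleanly: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (firstStartedPulling to lastFinishedPulling). Re-running the arithmetic with the downloads pod's own timestamps reproduces both printed values exactly; the formula here is inferred from those numbers, not quoted from kubelet source:

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse(time.RFC3339Nano, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Timestamps copied from the downloads-6bcc868b7-wnt9h record above.
	created := parse("2026-04-22T16:00:49Z")
	firstPull := parse("2026-04-22T16:00:50.132145285Z")
	lastPull := parse("2026-04-22T16:01:09.136207875Z")
	running := parse("2026-04-22T16:01:10.09207627Z") // watchObservedRunningTime

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull) // image-pull time is excluded

	fmt.Println("podStartE2EDuration:", e2e) // 21.09207627s, as logged
	fmt.Println("podStartSLOduration:", slo) // 2.08801368s, as logged
}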
16:00:49 +0000 UTC" firstStartedPulling="2026-04-22 16:00:50.132145285 +0000 UTC m=+127.424333987" lastFinishedPulling="2026-04-22 16:01:09.136207875 +0000 UTC m=+146.428396577" observedRunningTime="2026-04-22 16:01:10.090622404 +0000 UTC m=+147.382811129" watchObservedRunningTime="2026-04-22 16:01:10.09207627 +0000 UTC m=+147.384264995" Apr 22 16:01:11.384414 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.384371 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"] Apr 22 16:01:11.388423 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.388385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.390977 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.390947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 16:01:11.391866 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.391839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 16:01:11.392002 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.391887 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 16:01:11.392002 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.391975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6gvpr\"" Apr 22 16:01:11.392247 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.392227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 16:01:11.392492 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.392467 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 16:01:11.400426 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.400396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 16:01:11.401455 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.401421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"] Apr 22 16:01:11.481410 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481410 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2rr\" (UniqueName: \"kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr\") pod 
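Each "Caches populated" line above is a reflector completing its initial LIST for a watch on one secret or configmap the console pod mounts. A hedged client-go sketch of the same idea, using a namespace-scoped shared informer against a fake clientset; kubelet's real watches are narrower (per object name), so treat this as an approximation of the mechanism, not kubelet's code:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes/fake"
	"k8s.io/client-go/tools/cache"
)

func main() {
	// A fake clientset seeded with one of the secrets named in the log.
	client := fake.NewSimpleClientset(&corev1.Secret{
		ObjectMeta: metav1.ObjectMeta{Name: "console-serving-cert", Namespace: "openshift-console"},
	})

	// Scope the informer to the namespace; kubelet scopes each watch tighter.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 0, informers.WithNamespace("openshift-console"))
	secrets := factory.Core().V1().Secrets()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// "Caches populated" corresponds to this sync completing: the
	// reflector's initial LIST has landed in the local store.
	cache.WaitForCacheSync(stop, secrets.Informer().HasSynced)

	s, err := secrets.Lister().Secrets("openshift-console").Get("console-serving-cert")
	if err != nil {
		panic(err)
	}
	fmt.Println("cache populated, found:", s.Name)
}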
\"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481870 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.481870 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.481858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582342 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582342 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2rr\" (UniqueName: \"kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582612 ip-10-0-132-57 
kubenswrapper[2576]: I0422 16:01:11.582463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.582612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.582583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.583168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.583139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.583358 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.583219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.583358 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.583231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.586246 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.586220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.586383 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.586244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.586718 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.586697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config\") pod \"console-678d7cb6bd-h5wkz\" (UID: 
\"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.591402 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.591377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2rr\" (UniqueName: \"kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr\") pod \"console-678d7cb6bd-h5wkz\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") " pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.705131 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.705079 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:11.862432 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:11.862282 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"] Apr 22 16:01:11.865692 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:01:11.865635 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae88b81_223c_41bb_8250_983061de4bae.slice/crio-0f285c07908cbf5fd10e5493e3b3c51dbf87e03ddecdcb4233fac50c34b00112 WatchSource:0}: Error finding container 0f285c07908cbf5fd10e5493e3b3c51dbf87e03ddecdcb4233fac50c34b00112: Status 404 returned error can't find the container with id 0f285c07908cbf5fd10e5493e3b3c51dbf87e03ddecdcb4233fac50c34b00112 Apr 22 16:01:12.083038 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:12.082942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678d7cb6bd-h5wkz" event={"ID":"3ae88b81-223c-41bb-8250-983061de4bae","Type":"ContainerStarted","Data":"0f285c07908cbf5fd10e5493e3b3c51dbf87e03ddecdcb4233fac50c34b00112"} Apr 22 16:01:16.098110 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:16.098063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678d7cb6bd-h5wkz" event={"ID":"3ae88b81-223c-41bb-8250-983061de4bae","Type":"ContainerStarted","Data":"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"} Apr 22 16:01:16.113125 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:16.113045 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-678d7cb6bd-h5wkz" podStartSLOduration=1.6575727489999998 podStartE2EDuration="5.113028039s" podCreationTimestamp="2026-04-22 16:01:11 +0000 UTC" firstStartedPulling="2026-04-22 16:01:11.868156127 +0000 UTC m=+149.160344829" lastFinishedPulling="2026-04-22 16:01:15.3236114 +0000 UTC m=+152.615800119" observedRunningTime="2026-04-22 16:01:16.11266688 +0000 UTC m=+153.404855605" watchObservedRunningTime="2026-04-22 16:01:16.113028039 +0000 UTC m=+153.405216765" Apr 22 16:01:21.706142 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:21.706101 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:21.706142 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:21.706141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:21.711303 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:21.711275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:22.122192 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:22.122100 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-678d7cb6bd-h5wkz" Apr 22 16:01:54.469277 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.469192 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.469874 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="alertmanager" containerID="cri-o://809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75" gracePeriod=120 Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.469928 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy" containerID="cri-o://9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58" gracePeriod=120 Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.470035 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="prom-label-proxy" containerID="cri-o://f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711" gracePeriod=120 Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.470084 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-web" containerID="cri-o://c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5" gracePeriod=120 Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.470103 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-metric" containerID="cri-o://1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7" gracePeriod=120 Apr 22 16:01:54.470232 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:54.470160 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="config-reloader" containerID="cri-o://a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06" gracePeriod=120 Apr 22 16:01:55.226687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226644 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711" exitCode=0 Apr 22 16:01:55.226687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226672 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58" exitCode=0 Apr 22 16:01:55.226687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226681 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06" exitCode=0 Apr 22 16:01:55.226687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226686 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" 
containerID="809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75" exitCode=0 Apr 22 16:01:55.226960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"} Apr 22 16:01:55.226960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"} Apr 22 16:01:55.226960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"} Apr 22 16:01:55.226960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:55.226784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"} Apr 22 16:01:56.223675 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.223647 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.232206 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232174 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7" exitCode=0 Apr 22 16:01:56.232206 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232198 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerID="c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5" exitCode=0 Apr 22 16:01:56.232405 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"} Apr 22 16:01:56.232405 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"} Apr 22 16:01:56.232405 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2b08d485-46a7-4141-b73d-2fc7c8604222","Type":"ContainerDied","Data":"c664d0553d6a01cf91683802a6bdee4f8422fb55cdeefad4882cdebfca1d45a3"} Apr 22 16:01:56.232405 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232335 2576 scope.go:117] "RemoveContainer" containerID="f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711" Apr 22 16:01:56.232405 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.232339 2576 util.go:48] "No ready sandbox for pod can be found. 
Apr 22 16:01:56.241630 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.241603 2576 scope.go:117] "RemoveContainer" containerID="1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"
Apr 22 16:01:56.251008 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.250861 2576 scope.go:117] "RemoveContainer" containerID="9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"
Apr 22 16:01:56.261347 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.261316 2576 scope.go:117] "RemoveContainer" containerID="c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"
Apr 22 16:01:56.269292 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.269267 2576 scope.go:117] "RemoveContainer" containerID="a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"
Apr 22 16:01:56.277517 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.277491 2576 scope.go:117] "RemoveContainer" containerID="809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"
Apr 22 16:01:56.286675 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.286643 2576 scope.go:117] "RemoveContainer" containerID="521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"
Apr 22 16:01:56.294910 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.294884 2576 scope.go:117] "RemoveContainer" containerID="f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"
Apr 22 16:01:56.295267 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.295244 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711\": container with ID starting with f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711 not found: ID does not exist" containerID="f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"
Apr 22 16:01:56.295373 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295277 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"} err="failed to get container status \"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711\": rpc error: code = NotFound desc = could not find container \"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711\": container with ID starting with f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711 not found: ID does not exist"
Apr 22 16:01:56.295373 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295318 2576 scope.go:117] "RemoveContainer" containerID="1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"
Apr 22 16:01:56.295657 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.295640 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7\": container with ID starting with 1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7 not found: ID does not exist" containerID="1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"
Apr 22 16:01:56.295712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295662 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"} err="failed to get container status \"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7\": rpc error: code = NotFound desc = could not find container \"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7\": container with ID starting with 1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7 not found: ID does not exist"
Apr 22 16:01:56.295712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295677 2576 scope.go:117] "RemoveContainer" containerID="9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"
Apr 22 16:01:56.295920 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.295905 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58\": container with ID starting with 9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58 not found: ID does not exist" containerID="9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"
Apr 22 16:01:56.295959 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295928 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"} err="failed to get container status \"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58\": rpc error: code = NotFound desc = could not find container \"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58\": container with ID starting with 9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58 not found: ID does not exist"
Apr 22 16:01:56.295959 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.295943 2576 scope.go:117] "RemoveContainer" containerID="c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"
Apr 22 16:01:56.296221 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.296201 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5\": container with ID starting with c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5 not found: ID does not exist" containerID="c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"
Apr 22 16:01:56.296288 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296230 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"} err="failed to get container status \"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5\": rpc error: code = NotFound desc = could not find container \"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5\": container with ID starting with c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5 not found: ID does not exist"
Apr 22 16:01:56.296288 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296255 2576 scope.go:117] "RemoveContainer" containerID="a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"
Apr 22 16:01:56.296504 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.296482 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06\": container with ID starting with a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06 not found: ID does not exist" containerID="a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"
Apr 22 16:01:56.296561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296509 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"} err="failed to get container status \"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06\": rpc error: code = NotFound desc = could not find container \"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06\": container with ID starting with a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06 not found: ID does not exist"
Apr 22 16:01:56.296561 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296541 2576 scope.go:117] "RemoveContainer" containerID="809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"
Apr 22 16:01:56.296772 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.296754 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75\": container with ID starting with 809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75 not found: ID does not exist" containerID="809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"
Apr 22 16:01:56.296848 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296779 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"} err="failed to get container status \"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75\": rpc error: code = NotFound desc = could not find container \"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75\": container with ID starting with 809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75 not found: ID does not exist"
Apr 22 16:01:56.296848 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.296802 2576 scope.go:117] "RemoveContainer" containerID="521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"
Apr 22 16:01:56.297085 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:01:56.297063 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563\": container with ID starting with 521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563 not found: ID does not exist" containerID="521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"
Apr 22 16:01:56.297128 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297091 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"} err="failed to get container status \"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563\": rpc error: code = NotFound desc = could not find container \"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563\": container with ID starting with 521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563 not found: ID does not exist"
Apr 22 16:01:56.297128 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297108 2576 scope.go:117] "RemoveContainer" containerID="f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"
Apr 22 16:01:56.297323 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297305 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711"} err="failed to get container status \"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711\": rpc error: code = NotFound desc = could not find container \"f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711\": container with ID starting with f6d5c9be5cdf49319dadd7b0536c69f88e9b3428ae66984e5a1e064834784711 not found: ID does not exist"
Apr 22 16:01:56.297364 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297324 2576 scope.go:117] "RemoveContainer" containerID="1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"
Apr 22 16:01:56.297543 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297513 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7"} err="failed to get container status \"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7\": rpc error: code = NotFound desc = could not find container \"1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7\": container with ID starting with 1892b7c78db49814ec11b132127463dcfe923c37c80412d86ecef3c84057ffd7 not found: ID does not exist"
Apr 22 16:01:56.297595 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297546 2576 scope.go:117] "RemoveContainer" containerID="9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"
Apr 22 16:01:56.297743 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297725 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58"} err="failed to get container status \"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58\": rpc error: code = NotFound desc = could not find container \"9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58\": container with ID starting with 9c2d39e4ad694db78a4740d23f0a0129f4d15f0b0fc8954e0ffc6212fd5e2d58 not found: ID does not exist"
Apr 22 16:01:56.297786 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297743 2576 scope.go:117] "RemoveContainer" containerID="c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"
Apr 22 16:01:56.297960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297941 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5"} err="failed to get container status \"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5\": rpc error: code = NotFound desc = could not find container \"c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5\": container with ID starting with c210ab4fcf19b0ea102ca5d06558f16d0ad1dd9e25a56dd8ee4d4cb964b104a5 not found: ID does not exist"
Apr 22 16:01:56.298002 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.297962 2576 scope.go:117] "RemoveContainer" containerID="a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"
Apr 22 16:01:56.298143 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.298127 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06"} err="failed to get container status \"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06\": rpc error: code = NotFound desc = could not find container \"a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06\": container with ID starting with a8fc295d990906fd31e202d5082039ac52ddc8777849c6d34986ff51be540d06 not found: ID does not exist"
Apr 22 16:01:56.298192 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.298142 2576 scope.go:117] "RemoveContainer" containerID="809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"
Apr 22 16:01:56.298374 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.298342 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75"} err="failed to get container status \"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75\": rpc error: code = NotFound desc = could not find container \"809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75\": container with ID starting with 809fadf84dc2363902021108917c43cf2fdc140f4f1e2523389c21e815cd7a75 not found: ID does not exist"
Apr 22 16:01:56.298429 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.298376 2576 scope.go:117] "RemoveContainer" containerID="521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"
Apr 22 16:01:56.298634 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.298614 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563"} err="failed to get container status \"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563\": rpc error: code = NotFound desc = could not find container \"521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563\": container with ID starting with 521c7b0d27db4bc760599094afc41698d579fed128b754a84fa759c121bf0563 not found: ID does not exist"
\"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386355 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386456 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386391 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386456 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386431 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386509 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386567 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:01:56.386612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386575 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386702 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2b08d485-46a7-4141-b73d-2fc7c8604222\" (UID: \"2b08d485-46a7-4141-b73d-2fc7c8604222\") " Apr 22 16:01:56.386953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.386927 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:01:56.387669 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.387247 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-metrics-client-ca\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.387669 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.387272 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.387865 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.387730 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:01:56.389563 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.389464 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf" (OuterVolumeSpecName: "kube-api-access-cdjmf") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "kube-api-access-cdjmf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:01:56.389709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.389566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:01:56.390390 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.390352 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-config-out" (OuterVolumeSpecName: "config-out") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:01:56.390809 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.390775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.390935 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.390904 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.390935 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.390903 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.391123 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.391100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.391364 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.391335 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.395909 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.395880 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.402151 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.402111 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config" (OuterVolumeSpecName: "web-config") pod "2b08d485-46a7-4141-b73d-2fc7c8604222" (UID: "2b08d485-46a7-4141-b73d-2fc7c8604222"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:56.488262 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488220 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-tls-assets\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488333 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-config-out\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488367 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-main-tls\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488382 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-config-volume\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488395 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdjmf\" (UniqueName: \"kubernetes.io/projected/2b08d485-46a7-4141-b73d-2fc7c8604222-kube-api-access-cdjmf\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488411 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488421 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488430 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-cluster-tls-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 
22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488439 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-web-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488448 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2b08d485-46a7-4141-b73d-2fc7c8604222-alertmanager-main-db\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.488457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.488461 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2b08d485-46a7-4141-b73d-2fc7c8604222-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:01:56.555045 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.555002 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:56.560068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.560027 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:56.583507 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583471 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:56.583839 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583823 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="init-config-reloader" Apr 22 16:01:56.583892 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583855 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="init-config-reloader" Apr 22 16:01:56.583892 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583888 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="alertmanager" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583893 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="alertmanager" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583904 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-web" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583909 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-web" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="prom-label-proxy" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583921 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="prom-label-proxy" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583928 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" 
containerName="kube-rbac-proxy" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583933 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583942 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="config-reloader" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583947 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="config-reloader" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583954 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-metric" Apr 22 16:01:56.583953 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.583960 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-metric" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584002 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="config-reloader" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584010 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-web" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584017 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="prom-label-proxy" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584022 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="alertmanager" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584030 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy-metric" Apr 22 16:01:56.584284 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.584037 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" containerName="kube-rbac-proxy" Apr 22 16:01:56.589407 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.589387 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.591746 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.591722 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 16:01:56.592185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592168 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 16:01:56.592294 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 16:01:56.592294 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592279 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 16:01:56.592294 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hcfqp\"" Apr 22 16:01:56.592514 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 16:01:56.592514 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 16:01:56.592514 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 16:01:56.592514 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.592388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 16:01:56.597009 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.596990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 16:01:56.605124 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.600832 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:56.690273 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690273 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-config-out\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4np6\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-kube-api-access-h4np6\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690574 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690864 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690864 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-web-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690864 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690864 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.690864 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.690817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.791795 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.791756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.791795 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.791797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4np6\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-kube-api-access-h4np6\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792028 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.791818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792028 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.791837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792028 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.791856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792028 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792211 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792211 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-web-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792211 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-config-out\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.792699 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.792677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.793702 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.793671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b3bf771-ec61-4f1b-b902-111ac6068823-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.795357 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.795332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b3bf771-ec61-4f1b-b902-111ac6068823-config-out\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.795607 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.795586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.795708 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.795670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.795774 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.795729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.795854 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.795829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.796057 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.796032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-web-config\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.796057 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.796042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.796549 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.796506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.798098 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.798071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b3bf771-ec61-4f1b-b902-111ac6068823-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.799313 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.799296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4np6\" (UniqueName: \"kubernetes.io/projected/4b3bf771-ec61-4f1b-b902-111ac6068823-kube-api-access-h4np6\") pod \"alertmanager-main-0\" (UID: \"4b3bf771-ec61-4f1b-b902-111ac6068823\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:56.900450 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:56.900409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:01:57.043410 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:57.043385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:01:57.045853 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:01:57.045826 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3bf771_ec61_4f1b_b902_111ac6068823.slice/crio-fa8f086b24ce378a8ae7be440d41a1e0ca6fa6b52bfcd7f0dc630c71e23dbe04 WatchSource:0}: Error finding container fa8f086b24ce378a8ae7be440d41a1e0ca6fa6b52bfcd7f0dc630c71e23dbe04: Status 404 returned error can't find the container with id fa8f086b24ce378a8ae7be440d41a1e0ca6fa6b52bfcd7f0dc630c71e23dbe04 Apr 22 16:01:57.238019 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:57.237929 2576 generic.go:358] "Generic (PLEG): container finished" podID="4b3bf771-ec61-4f1b-b902-111ac6068823" containerID="a00aee3c081a02a424ad3b62773c02fe960f4a32cd10563dfdb423c1d11c0c5d" exitCode=0 Apr 22 16:01:57.238359 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:57.238018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerDied","Data":"a00aee3c081a02a424ad3b62773c02fe960f4a32cd10563dfdb423c1d11c0c5d"} Apr 22 16:01:57.238359 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:57.238055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"fa8f086b24ce378a8ae7be440d41a1e0ca6fa6b52bfcd7f0dc630c71e23dbe04"} Apr 22 16:01:57.393375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:57.393340 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b08d485-46a7-4141-b73d-2fc7c8604222" path="/var/lib/kubelet/pods/2b08d485-46a7-4141-b73d-2fc7c8604222/volumes" Apr 22 16:01:58.245231 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"d8d3c9bddea53f8330b91f21a428fed9ef3a9a06361acdef5fce8805c9012e42"} Apr 22 16:01:58.245231 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"761bfc99e5cbe0194745b5f2604bf8aa1a057b1e540380d53959cd44505c1075"} Apr 22 16:01:58.245709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"e680d03770ca0876ccae6a6f8db4ea08bc87dcd845cb56cc559a022d3f7e5c27"} Apr 22 16:01:58.245709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"1c3072d5ea1368ed059a1254900178a992f8a17cd47c65583d9c914a461699b7"} Apr 22 16:01:58.245709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"9e33ed8e22a92d6a9230ec49ce371d72f3738320056bad8faa1d3f0b71bb6f65"} Apr 22 16:01:58.245709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.245268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b3bf771-ec61-4f1b-b902-111ac6068823","Type":"ContainerStarted","Data":"4602dcea41da35a0b2898b35cd8b08f0d605a32d6e4ebb47b9d1c98766086a13"} Apr 22 16:01:58.271273 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:01:58.271166 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.271143191 podStartE2EDuration="2.271143191s" podCreationTimestamp="2026-04-22 16:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:01:58.268489693 +0000 UTC m=+195.560678405" watchObservedRunningTime="2026-04-22 16:01:58.271143191 +0000 UTC m=+195.563331918" Apr 22 16:02:08.364287 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:08.364237 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"] Apr 22 16:02:29.998999 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:29.998963 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tv4kw"] Apr 22 16:02:30.002298 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.002276 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.004796 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.004765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 16:02:30.006996 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.006968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tv4kw"] Apr 22 16:02:30.081892 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.081853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-kubelet-config\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.082062 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.081920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-dbus\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.082062 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.081952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b41e7195-65f0-4477-b983-c3847499b874-original-pull-secret\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.183150 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.183107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-kubelet-config\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.183320 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.183161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-dbus\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.183320 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.183188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b41e7195-65f0-4477-b983-c3847499b874-original-pull-secret\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.183320 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.183264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-kubelet-config\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.183426 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.183334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/b41e7195-65f0-4477-b983-c3847499b874-dbus\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.185765 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.185743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b41e7195-65f0-4477-b983-c3847499b874-original-pull-secret\") pod \"global-pull-secret-syncer-tv4kw\" (UID: \"b41e7195-65f0-4477-b983-c3847499b874\") " pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.313799 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.313691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tv4kw" Apr 22 16:02:30.442483 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:30.442329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tv4kw"] Apr 22 16:02:30.445251 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:02:30.445221 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41e7195_65f0_4477_b983_c3847499b874.slice/crio-bf5170f99ef397341c4151feb53902ddae12f5c2ba2055e9cd7f29651d33cdee WatchSource:0}: Error finding container bf5170f99ef397341c4151feb53902ddae12f5c2ba2055e9cd7f29651d33cdee: Status 404 returned error can't find the container with id bf5170f99ef397341c4151feb53902ddae12f5c2ba2055e9cd7f29651d33cdee Apr 22 16:02:31.347776 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:31.347737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tv4kw" event={"ID":"b41e7195-65f0-4477-b983-c3847499b874","Type":"ContainerStarted","Data":"bf5170f99ef397341c4151feb53902ddae12f5c2ba2055e9cd7f29651d33cdee"} Apr 22 16:02:33.385156 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:33.385088 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-678d7cb6bd-h5wkz" podUID="3ae88b81-223c-41bb-8250-983061de4bae" containerName="console" containerID="cri-o://68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd" gracePeriod=15 Apr 22 16:02:34.712023 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.711992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-678d7cb6bd-h5wkz_3ae88b81-223c-41bb-8250-983061de4bae/console/0.log" Apr 22 16:02:34.712455 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.712086 2576 util.go:48] "No ready sandbox for pod can be found. 
Apr 22 16:02:34.712455 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.712086 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678d7cb6bd-h5wkz"
Apr 22 16:02:34.824345 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824227 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824345 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824280 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2rr\" (UniqueName: \"kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824345 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824309 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824345 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824335 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824728 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824380 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824728 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824408 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824728 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert\") pod \"3ae88b81-223c-41bb-8250-983061de4bae\" (UID: \"3ae88b81-223c-41bb-8250-983061de4bae\") "
Apr 22 16:02:34.824878 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824835 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:02:34.824878 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:02:34.824989 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.824868 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config" (OuterVolumeSpecName: "console-config") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:02:34.825266 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.825100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca" (OuterVolumeSpecName: "service-ca") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:02:34.827004 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.826976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:02:34.827113 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.827067 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr" (OuterVolumeSpecName: "kube-api-access-nr2rr") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "kube-api-access-nr2rr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:02:34.827173 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.827105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3ae88b81-223c-41bb-8250-983061de4bae" (UID: "3ae88b81-223c-41bb-8250-983061de4bae"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:02:34.925684 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925627 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-oauth-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925684 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925679 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-service-ca\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925684 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925690 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925684 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925699 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ae88b81-223c-41bb-8250-983061de4bae-console-oauth-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925969 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925710 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nr2rr\" (UniqueName: \"kubernetes.io/projected/3ae88b81-223c-41bb-8250-983061de4bae-kube-api-access-nr2rr\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925969 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925720 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-trusted-ca-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:34.925969 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:34.925730 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ae88b81-223c-41bb-8250-983061de4bae-console-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:02:35.362827 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.362774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tv4kw" event={"ID":"b41e7195-65f0-4477-b983-c3847499b874","Type":"ContainerStarted","Data":"c7af70e9cf6cffc5cd37d6af537300663bf01b996b6faf43a2df50c5927681ca"}
Apr 22 16:02:35.364095 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-678d7cb6bd-h5wkz_3ae88b81-223c-41bb-8250-983061de4bae/console/0.log"
Apr 22 16:02:35.364208 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364116 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ae88b81-223c-41bb-8250-983061de4bae" containerID="68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd" exitCode=2
Apr 22 16:02:35.364208 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678d7cb6bd-h5wkz" event={"ID":"3ae88b81-223c-41bb-8250-983061de4bae","Type":"ContainerDied","Data":"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"}
Apr 22 16:02:35.364208 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364180 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678d7cb6bd-h5wkz"
Apr 22 16:02:35.364208 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678d7cb6bd-h5wkz" event={"ID":"3ae88b81-223c-41bb-8250-983061de4bae","Type":"ContainerDied","Data":"0f285c07908cbf5fd10e5493e3b3c51dbf87e03ddecdcb4233fac50c34b00112"}
Apr 22 16:02:35.364341 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.364215 2576 scope.go:117] "RemoveContainer" containerID="68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"
Apr 22 16:02:35.372914 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.372881 2576 scope.go:117] "RemoveContainer" containerID="68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"
Apr 22 16:02:35.373275 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:02:35.373251 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd\": container with ID starting with 68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd not found: ID does not exist" containerID="68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"
Apr 22 16:02:35.373328 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.373285 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd"} err="failed to get container status \"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd\": rpc error: code = NotFound desc = could not find container \"68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd\": container with ID starting with 68bd710d63b86fd1de2b285da702c36032dbb8049ee9bbfed67b1dcb4f7126cd not found: ID does not exist"
Apr 22 16:02:35.377257 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.377205 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tv4kw" podStartSLOduration=2.224171481 podStartE2EDuration="6.377189211s" podCreationTimestamp="2026-04-22 16:02:29 +0000 UTC" firstStartedPulling="2026-04-22 16:02:30.447157655 +0000 UTC m=+227.739346370" lastFinishedPulling="2026-04-22 16:02:34.600175374 +0000 UTC m=+231.892364100" observedRunningTime="2026-04-22 16:02:35.376705441 +0000 UTC m=+232.668894165" watchObservedRunningTime="2026-04-22 16:02:35.377189211 +0000 UTC m=+232.669377934"
Apr 22 16:02:35.391944 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.391911 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"]
Apr 22 16:02:35.392119 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:35.392043 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-678d7cb6bd-h5wkz"]
Apr 22 16:02:37.392512 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:02:37.392474 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae88b81-223c-41bb-8250-983061de4bae" path="/var/lib/kubelet/pods/3ae88b81-223c-41bb-8250-983061de4bae/volumes"
Apr 22 16:03:43.263510 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:03:43.263477 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log"
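In the console teardown above, RemoveContainer is followed by a NotFound from the runtime, which the kubelet records and then moves past: deleting an already-gone container is tolerated rather than treated as a sync failure. A sketch of that idempotent-delete shape (`remove` is a hypothetical callback, not the CRI client):

```go
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound")

// removeContainer deletes a container but treats "already gone" as success,
// matching how the log above records the NotFound without failing the sync.
func removeContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("DeleteContainer returned error containerID=%s err=%v (tolerated)\n", id, err)
			return nil
		}
		return err
	}
	return nil
}

func main() {
	gone := func(id string) error {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	fmt.Println(removeContainer(gone, "68bd710d63b8")) // <nil>
}
```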
Apr 22 16:03:43.264050 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:03:43.263477 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log"
Apr 22 16:03:43.271121 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:03:43.271089 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 16:04:36.051283 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.051186 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"]
Apr 22 16:04:36.051712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.051501 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ae88b81-223c-41bb-8250-983061de4bae" containerName="console"
Apr 22 16:04:36.051712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.051513 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae88b81-223c-41bb-8250-983061de4bae" containerName="console"
Apr 22 16:04:36.051712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.051586 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ae88b81-223c-41bb-8250-983061de4bae" containerName="console"
Apr 22 16:04:36.054617 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.054596 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.057022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.056992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\""
Apr 22 16:04:36.057181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.057029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 16:04:36.057744 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.057721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 16:04:36.070799 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.070764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"]
Apr 22 16:04:36.136803 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.136754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.137010 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.136833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.137010 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.136857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54sc4\" (UniqueName: \"kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.237434 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.237385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54sc4\" (UniqueName: \"kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.237650 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.237456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.237650 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.237508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.237840 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.237820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.237879 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.237846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.245457 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.245414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54sc4\" (UniqueName: \"kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.364633 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.364504 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:36.495435 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.495395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"]
Apr 22 16:04:36.498271 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:04:36.498238 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52cfbe4f_2ade_4ba2_b8b5_6c5bcb7ed349.slice/crio-b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b WatchSource:0}: Error finding container b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b: Status 404 returned error can't find the container with id b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b
Apr 22 16:04:36.500279 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.500254 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:04:36.718729 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:36.718689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8" event={"ID":"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349","Type":"ContainerStarted","Data":"b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b"}
Apr 22 16:04:44.749315 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:44.749275 2576 generic.go:358] "Generic (PLEG): container finished" podID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerID="f086d5cd321e709a037f18e4562f27af58e78bb323ed06bce4cf768f95af4ae1" exitCode=0
Apr 22 16:04:44.749819 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:44.749351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8" event={"ID":"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349","Type":"ContainerDied","Data":"f086d5cd321e709a037f18e4562f27af58e78bb323ed06bce4cf768f95af4ae1"}
Apr 22 16:04:46.759116 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:46.759083 2576 generic.go:358] "Generic (PLEG): container finished" podID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerID="3bf1bd08e677c8f592278275a7c01bfc94c48406751ac88b3c4e4ebc91a1ae1b" exitCode=0
Apr 22 16:04:46.759539 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:46.759160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8" event={"ID":"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349","Type":"ContainerDied","Data":"3bf1bd08e677c8f592278275a7c01bfc94c48406751ac88b3c4e4ebc91a1ae1b"}
Apr 22 16:04:53.782433 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:53.782392 2576 generic.go:358] "Generic (PLEG): container finished" podID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerID="f5dc6fbc2b89f981d9db7532138d57a51e392a16f80992dce3ba4be8e8c1eefb" exitCode=0
Apr 22 16:04:53.782875 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:53.782477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8" event={"ID":"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349","Type":"ContainerDied","Data":"f5dc6fbc2b89f981d9db7532138d57a51e392a16f80992dce3ba4be8e8c1eefb"}
Apr 22 16:04:54.920655 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:54.920625 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:55.012309 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.012268 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle\") pod \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") "
Apr 22 16:04:55.012517 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.012327 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54sc4\" (UniqueName: \"kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4\") pod \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") "
Apr 22 16:04:55.012517 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.012369 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util\") pod \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\" (UID: \"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349\") "
Apr 22 16:04:55.013064 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.013034 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle" (OuterVolumeSpecName: "bundle") pod "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" (UID: "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:04:55.014806 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.014761 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4" (OuterVolumeSpecName: "kube-api-access-54sc4") pod "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" (UID: "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349"). InnerVolumeSpecName "kube-api-access-54sc4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:04:55.016640 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.016605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util" (OuterVolumeSpecName: "util") pod "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" (UID: "52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:04:55.113349 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.113257 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:04:55.113349 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.113290 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54sc4\" (UniqueName: \"kubernetes.io/projected/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-kube-api-access-54sc4\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:04:55.113349 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.113302 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349-util\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:04:55.789895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.789858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8" event={"ID":"52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349","Type":"ContainerDied","Data":"b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b"}
Apr 22 16:04:55.789895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.789896 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e6f7a542f722d3c47dd4a7415f8329b5f7acdbdd227c53cbb6d1c9cbc7f66b"
Apr 22 16:04:55.790124 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:55.789940 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d47cq8"
Apr 22 16:04:58.888216 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.888163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"]
Apr 22 16:04:58.889431 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889395 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="pull"
Apr 22 16:04:58.889431 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889432 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="pull"
Apr 22 16:04:58.889687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889452 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="extract"
Apr 22 16:04:58.889687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889460 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="extract"
Apr 22 16:04:58.889687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889487 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="util"
Apr 22 16:04:58.889687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889495 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="util"
Apr 22 16:04:58.889855 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.889709 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="52cfbe4f-2ade-4ba2-b8b5-6c5bcb7ed349" containerName="extract"
Apr 22 16:04:58.950885 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.950833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"]
Apr 22 16:04:58.951061 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.950973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:58.954091 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.954068 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 16:04:58.954257 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.954067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:04:58.954257 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:58.954198 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-d7sjv\""
Apr 22 16:04:59.042484 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.042444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96960472-df50-44b8-bdd4-d4248d22f6b6-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:59.042484 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.042486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2drn\" (UniqueName: \"kubernetes.io/projected/96960472-df50-44b8-bdd4-d4248d22f6b6-kube-api-access-q2drn\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:59.143375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.143261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96960472-df50-44b8-bdd4-d4248d22f6b6-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:59.143375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.143317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2drn\" (UniqueName: \"kubernetes.io/projected/96960472-df50-44b8-bdd4-d4248d22f6b6-kube-api-access-q2drn\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:59.143746 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.143720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96960472-df50-44b8-bdd4-d4248d22f6b6-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"
Apr 22 16:04:59.151718 ip-10-0-132-57
kubenswrapper[2576]: I0422 16:04:59.151679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2drn\" (UniqueName: \"kubernetes.io/projected/96960472-df50-44b8-bdd4-d4248d22f6b6-kube-api-access-q2drn\") pod \"cert-manager-operator-controller-manager-54b9655956-dgs4l\" (UID: \"96960472-df50-44b8-bdd4-d4248d22f6b6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l" Apr 22 16:04:59.260724 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.260680 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l" Apr 22 16:04:59.402446 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.402282 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l"] Apr 22 16:04:59.405667 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:04:59.405638 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96960472_df50_44b8_bdd4_d4248d22f6b6.slice/crio-11487a453269f58c58c4117e0820465f86e77d0d31c5d1391bd0ab006af1c1a9 WatchSource:0}: Error finding container 11487a453269f58c58c4117e0820465f86e77d0d31c5d1391bd0ab006af1c1a9: Status 404 returned error can't find the container with id 11487a453269f58c58c4117e0820465f86e77d0d31c5d1391bd0ab006af1c1a9 Apr 22 16:04:59.803545 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:04:59.803493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l" event={"ID":"96960472-df50-44b8-bdd4-d4248d22f6b6","Type":"ContainerStarted","Data":"11487a453269f58c58c4117e0820465f86e77d0d31c5d1391bd0ab006af1c1a9"} Apr 22 16:05:01.812826 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:01.812785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l" event={"ID":"96960472-df50-44b8-bdd4-d4248d22f6b6","Type":"ContainerStarted","Data":"fa7e3f11d1627911c2581655315dce6d1c969e1f97270e44f56ce9ccbd4ca693"} Apr 22 16:05:01.834055 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:01.833983 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-dgs4l" podStartSLOduration=1.824075053 podStartE2EDuration="3.833957198s" podCreationTimestamp="2026-04-22 16:04:58 +0000 UTC" firstStartedPulling="2026-04-22 16:04:59.408307199 +0000 UTC m=+376.700495901" lastFinishedPulling="2026-04-22 16:05:01.418189343 +0000 UTC m=+378.710378046" observedRunningTime="2026-04-22 16:05:01.830756536 +0000 UTC m=+379.122945260" watchObservedRunningTime="2026-04-22 16:05:01.833957198 +0000 UTC m=+379.126145925" Apr 22 16:05:07.741980 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.741929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4kbwr"] Apr 22 16:05:07.744801 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.744773 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.747359 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.747323 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 16:05:07.747508 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.747329 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 16:05:07.748109 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.748091 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8mpts\"" Apr 22 16:05:07.751832 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.751797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4kbwr"] Apr 22 16:05:07.806892 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.806839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ql6q\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-kube-api-access-2ql6q\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.806892 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.806897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.907396 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.907354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ql6q\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-kube-api-access-2ql6q\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.907396 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.907405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.914667 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.914636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:07.914824 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:07.914735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ql6q\" (UniqueName: \"kubernetes.io/projected/4715b382-f0bb-49df-9686-26433ad0333b-kube-api-access-2ql6q\") pod \"cert-manager-cainjector-68b757865b-4kbwr\" (UID: \"4715b382-f0bb-49df-9686-26433ad0333b\") " 
pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:08.064757 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:08.064655 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" Apr 22 16:05:08.196784 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:08.196741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4kbwr"] Apr 22 16:05:08.200093 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:05:08.200063 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4715b382_f0bb_49df_9686_26433ad0333b.slice/crio-4ee648078ab94d625483752ff7fc7c317fa47463160b75967fc9f5c051bdfa7c WatchSource:0}: Error finding container 4ee648078ab94d625483752ff7fc7c317fa47463160b75967fc9f5c051bdfa7c: Status 404 returned error can't find the container with id 4ee648078ab94d625483752ff7fc7c317fa47463160b75967fc9f5c051bdfa7c Apr 22 16:05:08.837423 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:08.837388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" event={"ID":"4715b382-f0bb-49df-9686-26433ad0333b","Type":"ContainerStarted","Data":"4ee648078ab94d625483752ff7fc7c317fa47463160b75967fc9f5c051bdfa7c"} Apr 22 16:05:10.846071 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:10.846029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" event={"ID":"4715b382-f0bb-49df-9686-26433ad0333b","Type":"ContainerStarted","Data":"061f9b996ceedbb911247137607b747fe593b440b12faac8c0570208fd739fb6"} Apr 22 16:05:10.863068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:10.863012 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-4kbwr" podStartSLOduration=1.354301253 podStartE2EDuration="3.862997051s" podCreationTimestamp="2026-04-22 16:05:07 +0000 UTC" firstStartedPulling="2026-04-22 16:05:08.202414916 +0000 UTC m=+385.494603618" lastFinishedPulling="2026-04-22 16:05:10.71111071 +0000 UTC m=+388.003299416" observedRunningTime="2026-04-22 16:05:10.861505267 +0000 UTC m=+388.153693988" watchObservedRunningTime="2026-04-22 16:05:10.862997051 +0000 UTC m=+388.155185774" Apr 22 16:05:33.237362 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.237321 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2"] Apr 22 16:05:33.239646 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.239627 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.241870 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.241840 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:05:33.242049 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.241841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-8s5vw\"" Apr 22 16:05:33.242585 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.242569 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 16:05:33.247474 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.247448 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2"] Apr 22 16:05:33.333400 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.333358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-tmp\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.333611 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.333446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gv8w\" (UniqueName: \"kubernetes.io/projected/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-kube-api-access-2gv8w\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.434351 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.434311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-tmp\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.434582 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.434404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gv8w\" (UniqueName: \"kubernetes.io/projected/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-kube-api-access-2gv8w\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.434798 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.434774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-tmp\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.442696 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.442665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gv8w\" (UniqueName: \"kubernetes.io/projected/d5bb611c-c5a3-46c4-83d0-19da2d69dc62-kube-api-access-2gv8w\") pod \"openshift-lws-operator-bfc7f696d-wgxq2\" (UID: \"d5bb611c-c5a3-46c4-83d0-19da2d69dc62\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.549449 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.549344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" Apr 22 16:05:33.683611 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.683583 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2"] Apr 22 16:05:33.686265 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:05:33.686233 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5bb611c_c5a3_46c4_83d0_19da2d69dc62.slice/crio-cf80d5824441be71dd17b051cca7b88e22fe60e27261a5ee0507b4052e8bf5a1 WatchSource:0}: Error finding container cf80d5824441be71dd17b051cca7b88e22fe60e27261a5ee0507b4052e8bf5a1: Status 404 returned error can't find the container with id cf80d5824441be71dd17b051cca7b88e22fe60e27261a5ee0507b4052e8bf5a1 Apr 22 16:05:33.927160 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:33.927053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" event={"ID":"d5bb611c-c5a3-46c4-83d0-19da2d69dc62","Type":"ContainerStarted","Data":"cf80d5824441be71dd17b051cca7b88e22fe60e27261a5ee0507b4052e8bf5a1"} Apr 22 16:05:36.940360 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:36.940316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" event={"ID":"d5bb611c-c5a3-46c4-83d0-19da2d69dc62","Type":"ContainerStarted","Data":"29e6a0bc7fb0f9d9cd19c79e5651b9723980eb5ce31f84b0109afa65848962a1"} Apr 22 16:05:36.962918 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:36.962863 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wgxq2" podStartSLOduration=1.137970883 podStartE2EDuration="3.962845613s" podCreationTimestamp="2026-04-22 16:05:33 +0000 UTC" firstStartedPulling="2026-04-22 16:05:33.687856814 +0000 UTC m=+410.980045516" lastFinishedPulling="2026-04-22 16:05:36.512731525 +0000 UTC m=+413.804920246" observedRunningTime="2026-04-22 16:05:36.960438635 +0000 UTC m=+414.252627360" watchObservedRunningTime="2026-04-22 16:05:36.962845613 +0000 UTC m=+414.255034336" Apr 22 16:05:40.408283 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.408240 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7"] Apr 22 16:05:40.411168 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.411148 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.413558 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.413505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:05:40.413558 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.413523 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:05:40.413767 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.413515 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\"" Apr 22 16:05:40.418231 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.418191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7"] Apr 22 16:05:40.495090 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.495047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.495090 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.495095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.495328 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.495177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h644\" (UniqueName: \"kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.596426 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.596381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.596673 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.596432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.596673 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.596468 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6h644\" (UniqueName: \"kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.596877 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.596850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.596877 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.596866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.604247 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.604221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h644\" (UniqueName: \"kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.723000 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.722964 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:40.858376 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.858219 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7"] Apr 22 16:05:40.861402 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:05:40.861369 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f1a339_d1b2_447b_90ed_2943bebd1807.slice/crio-e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac WatchSource:0}: Error finding container e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac: Status 404 returned error can't find the container with id e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac Apr 22 16:05:40.956919 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.956880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerStarted","Data":"7d186870d6d8ea900bf9a97b75133d744fb39cd90b4149d03124833dcc5aaefa"} Apr 22 16:05:40.957101 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:40.956928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerStarted","Data":"e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac"} Apr 22 16:05:41.961830 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:41.961794 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerID="7d186870d6d8ea900bf9a97b75133d744fb39cd90b4149d03124833dcc5aaefa" exitCode=0 Apr 22 16:05:41.962307 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:41.961910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerDied","Data":"7d186870d6d8ea900bf9a97b75133d744fb39cd90b4149d03124833dcc5aaefa"} Apr 22 16:05:42.967772 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:42.967664 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerID="843913cfec657b14a36b0a6e623cd2caf3bed78dd421103423a662537312c81d" exitCode=0 Apr 22 16:05:42.967772 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:42.967719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerDied","Data":"843913cfec657b14a36b0a6e623cd2caf3bed78dd421103423a662537312c81d"} Apr 22 16:05:43.973179 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:43.973136 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerID="79d2573916e26bfdcb6f6ddc76b81a1d7ea53dd659e169ba66e7541d5255545f" exitCode=0 Apr 22 16:05:43.973609 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:43.973200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" 
event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerDied","Data":"79d2573916e26bfdcb6f6ddc76b81a1d7ea53dd659e169ba66e7541d5255545f"} Apr 22 16:05:45.104727 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.104698 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:45.234967 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.234849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util\") pod \"51f1a339-d1b2-447b-90ed-2943bebd1807\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " Apr 22 16:05:45.234967 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.234910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle\") pod \"51f1a339-d1b2-447b-90ed-2943bebd1807\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " Apr 22 16:05:45.235190 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.234999 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h644\" (UniqueName: \"kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644\") pod \"51f1a339-d1b2-447b-90ed-2943bebd1807\" (UID: \"51f1a339-d1b2-447b-90ed-2943bebd1807\") " Apr 22 16:05:45.235670 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.235632 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle" (OuterVolumeSpecName: "bundle") pod "51f1a339-d1b2-447b-90ed-2943bebd1807" (UID: "51f1a339-d1b2-447b-90ed-2943bebd1807"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:05:45.237395 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.237363 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644" (OuterVolumeSpecName: "kube-api-access-6h644") pod "51f1a339-d1b2-447b-90ed-2943bebd1807" (UID: "51f1a339-d1b2-447b-90ed-2943bebd1807"). InnerVolumeSpecName "kube-api-access-6h644". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:05:45.241354 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.241313 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util" (OuterVolumeSpecName: "util") pod "51f1a339-d1b2-447b-90ed-2943bebd1807" (UID: "51f1a339-d1b2-447b-90ed-2943bebd1807"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:05:45.336202 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.336135 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6h644\" (UniqueName: \"kubernetes.io/projected/51f1a339-d1b2-447b-90ed-2943bebd1807-kube-api-access-6h644\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:05:45.336202 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.336192 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-util\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:05:45.336202 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.336203 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f1a339-d1b2-447b-90ed-2943bebd1807-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:05:45.989867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.989814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" event={"ID":"51f1a339-d1b2-447b-90ed-2943bebd1807","Type":"ContainerDied","Data":"e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac"} Apr 22 16:05:45.990181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.990146 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13ecb46f54fe916e5dab8d4a61889cb323a63976706c78103c906375b94fdac" Apr 22 16:05:45.990448 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:45.990435 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vbbt7" Apr 22 16:05:52.967917 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.967876 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8"] Apr 22 16:05:52.968908 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.968876 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="pull" Apr 22 16:05:52.969115 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969090 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="pull" Apr 22 16:05:52.969115 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969116 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="util" Apr 22 16:05:52.969305 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969125 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="util" Apr 22 16:05:52.969305 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969153 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="extract" Apr 22 16:05:52.969305 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969162 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="extract" Apr 22 16:05:52.969305 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.969261 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="51f1a339-d1b2-447b-90ed-2943bebd1807" containerName="extract" Apr 22 
16:05:52.972203 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.972181 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:52.975128 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.975102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 16:05:52.975569 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.975553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pj4nw\"" Apr 22 16:05:52.976075 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.976055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 16:05:52.976176 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.976154 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 16:05:52.976238 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.976195 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 16:05:52.992521 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:52.992477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8"] Apr 22 16:05:53.000937 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.000905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85rf\" (UniqueName: \"kubernetes.io/projected/0edae815-51c6-4665-9eb8-b2908ac3053e-kube-api-access-d85rf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.001111 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.000961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.001111 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.001032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.101984 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.101943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d85rf\" (UniqueName: \"kubernetes.io/projected/0edae815-51c6-4665-9eb8-b2908ac3053e-kube-api-access-d85rf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.102171 ip-10-0-132-57 kubenswrapper[2576]: I0422 
16:05:53.102010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.102171 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.102068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.105034 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.104976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.105173 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.105111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0edae815-51c6-4665-9eb8-b2908ac3053e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.110250 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.110226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85rf\" (UniqueName: \"kubernetes.io/projected/0edae815-51c6-4665-9eb8-b2908ac3053e-kube-api-access-d85rf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-5v6q8\" (UID: \"0edae815-51c6-4665-9eb8-b2908ac3053e\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.283375 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.283281 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:53.434269 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:53.434209 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8"] Apr 22 16:05:53.437809 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:05:53.437766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0edae815_51c6_4665_9eb8_b2908ac3053e.slice/crio-dbe8d7594f61c1383ad7646b646cf563e6288a87db9c8d285ee6d79e1179cdd3 WatchSource:0}: Error finding container dbe8d7594f61c1383ad7646b646cf563e6288a87db9c8d285ee6d79e1179cdd3: Status 404 returned error can't find the container with id dbe8d7594f61c1383ad7646b646cf563e6288a87db9c8d285ee6d79e1179cdd3 Apr 22 16:05:54.020592 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:54.020521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" event={"ID":"0edae815-51c6-4665-9eb8-b2908ac3053e","Type":"ContainerStarted","Data":"dbe8d7594f61c1383ad7646b646cf563e6288a87db9c8d285ee6d79e1179cdd3"} Apr 22 16:05:57.038967 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:57.038930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" event={"ID":"0edae815-51c6-4665-9eb8-b2908ac3053e","Type":"ContainerStarted","Data":"4d078c1c1bfedc7c2cc7c80a41b0675b7d83d60049722bccaef8cc6152021730"} Apr 22 16:05:57.039450 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:57.038991 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:05:57.058217 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:05:57.058162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" podStartSLOduration=2.417089638 podStartE2EDuration="5.058144732s" podCreationTimestamp="2026-04-22 16:05:52 +0000 UTC" firstStartedPulling="2026-04-22 16:05:53.442125569 +0000 UTC m=+430.734314271" lastFinishedPulling="2026-04-22 16:05:56.083180661 +0000 UTC m=+433.375369365" observedRunningTime="2026-04-22 16:05:57.05697024 +0000 UTC m=+434.349158964" watchObservedRunningTime="2026-04-22 16:05:57.058144732 +0000 UTC m=+434.350333456" Apr 22 16:06:08.045191 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:08.045097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-5v6q8" Apr 22 16:06:11.227322 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.227280 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln"] Apr 22 16:06:11.232458 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.232412 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.234035 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.233998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.234196 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.234047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.234196 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.234114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zn5v\" (UniqueName: \"kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.234573 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.234555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:06:11.234662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.234642 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:06:11.234767 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.234754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\"" Apr 22 16:06:11.242014 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.241988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln"] Apr 22 16:06:11.328270 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.328237 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fcf674f7-rsdpn"] Apr 22 16:06:11.330654 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.330634 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.333017 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.332977 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 16:06:11.333017 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.332974 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 16:06:11.333227 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.333127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 16:06:11.333362 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.333343 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 16:06:11.333417 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.333377 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 16:06:11.333475 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.333440 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6gvpr\"" Apr 22 16:06:11.334638 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.334712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.334712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zn5v\" (UniqueName: \"kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.334712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-oauth-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.334848 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-trusted-ca-bundle\") pod \"console-7fcf674f7-rsdpn\" (UID: 
\"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.334848 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2zn\" (UniqueName: \"kubernetes.io/projected/4f2e4215-d989-4582-a4fd-50e024290fda-kube-api-access-mb2zn\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.334945 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.334945 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-oauth-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.334945 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.334921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-console-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.335086 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.335039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.335086 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.335054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-service-ca\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.335086 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.335078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.338235 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.338209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 16:06:11.342768 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.342738 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fcf674f7-rsdpn"] 
Apr 22 16:06:11.345196 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.345169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zn5v\" (UniqueName: \"kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.435632 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-oauth-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.435632 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-trusted-ca-bundle\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.435911 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2zn\" (UniqueName: \"kubernetes.io/projected/4f2e4215-d989-4582-a4fd-50e024290fda-kube-api-access-mb2zn\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.435911 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.435911 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-oauth-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.435911 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.435768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-console-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.436224 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.436178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-service-ca\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.437021 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.436894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-oauth-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.437175 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.437037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-console-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.437175 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.437101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-trusted-ca-bundle\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.440270 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.437617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2e4215-d989-4582-a4fd-50e024290fda-service-ca\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.445318 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.445288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-oauth-config\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.445590 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.445567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2e4215-d989-4582-a4fd-50e024290fda-console-serving-cert\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.445725 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.445703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2zn\" (UniqueName: \"kubernetes.io/projected/4f2e4215-d989-4582-a4fd-50e024290fda-kube-api-access-mb2zn\") pod \"console-7fcf674f7-rsdpn\" (UID: \"4f2e4215-d989-4582-a4fd-50e024290fda\") " pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.544827 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.544724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:11.643184 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.643141 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:11.684898 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.684752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln"] Apr 22 16:06:11.688026 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:11.687992 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab610e1a_c1b7_43ec_b499_389d356a8cac.slice/crio-35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352 WatchSource:0}: Error finding container 35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352: Status 404 returned error can't find the container with id 35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352 Apr 22 16:06:11.789583 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:11.789545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fcf674f7-rsdpn"] Apr 22 16:06:11.792929 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:11.792881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2e4215_d989_4582_a4fd_50e024290fda.slice/crio-fd7110c0915ff3190ba5f0cec4325d6b9f65b3af397584f38ef688c274f99b8c WatchSource:0}: Error finding container fd7110c0915ff3190ba5f0cec4325d6b9f65b3af397584f38ef688c274f99b8c: Status 404 returned error can't find the container with id fd7110c0915ff3190ba5f0cec4325d6b9f65b3af397584f38ef688c274f99b8c Apr 22 16:06:12.097038 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.096928 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l"] Apr 22 16:06:12.098879 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.098850 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerID="63485ebe58eef49c42d482cee32e710485b65a496c07891d17349af19ac89c47" exitCode=0 Apr 22 16:06:12.099754 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.099733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fcf674f7-rsdpn" event={"ID":"4f2e4215-d989-4582-a4fd-50e024290fda","Type":"ContainerStarted","Data":"421b11422e0a1b8e4f6ffda4802d7ca22c243fb1fe3c3eeec3ad4330343e6b49"} Apr 22 16:06:12.099837 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.099758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fcf674f7-rsdpn" event={"ID":"4f2e4215-d989-4582-a4fd-50e024290fda","Type":"ContainerStarted","Data":"fd7110c0915ff3190ba5f0cec4325d6b9f65b3af397584f38ef688c274f99b8c"} Apr 22 16:06:12.099837 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.099769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" event={"ID":"ab610e1a-c1b7-43ec-b499-389d356a8cac","Type":"ContainerDied","Data":"63485ebe58eef49c42d482cee32e710485b65a496c07891d17349af19ac89c47"} Apr 22 16:06:12.099837 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.099781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" event={"ID":"ab610e1a-c1b7-43ec-b499-389d356a8cac","Type":"ContainerStarted","Data":"35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352"} Apr 22 16:06:12.099993 ip-10-0-132-57 kubenswrapper[2576]: 
I0422 16:06:12.099890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.101957 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.101928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 16:06:12.101957 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.101943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-kdnln\"" Apr 22 16:06:12.102087 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.102013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 16:06:12.109798 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.109771 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l"] Apr 22 16:06:12.120848 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.120781 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fcf674f7-rsdpn" podStartSLOduration=1.120761986 podStartE2EDuration="1.120761986s" podCreationTimestamp="2026-04-22 16:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:06:12.12024184 +0000 UTC m=+449.412430577" watchObservedRunningTime="2026-04-22 16:06:12.120761986 +0000 UTC m=+449.412950712" Apr 22 16:06:12.141148 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.141105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75257664-1fa7-401a-9046-9a756b9e9335-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.141363 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.141218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5x7\" (UniqueName: \"kubernetes.io/projected/75257664-1fa7-401a-9046-9a756b9e9335-kube-api-access-mm5x7\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.141363 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.141334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75257664-1fa7-401a-9046-9a756b9e9335-tmp\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.242446 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.242399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75257664-1fa7-401a-9046-9a756b9e9335-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.242897 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.242492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5x7\" (UniqueName: 
\"kubernetes.io/projected/75257664-1fa7-401a-9046-9a756b9e9335-kube-api-access-mm5x7\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.242897 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.242562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75257664-1fa7-401a-9046-9a756b9e9335-tmp\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.245107 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.245063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75257664-1fa7-401a-9046-9a756b9e9335-tmp\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.245285 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.245262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75257664-1fa7-401a-9046-9a756b9e9335-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.250434 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.250398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5x7\" (UniqueName: \"kubernetes.io/projected/75257664-1fa7-401a-9046-9a756b9e9335-kube-api-access-mm5x7\") pod \"kube-auth-proxy-6c4b9b554-ldk8l\" (UID: \"75257664-1fa7-401a-9046-9a756b9e9335\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.412035 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.411932 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" Apr 22 16:06:12.548560 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:12.548245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l"] Apr 22 16:06:12.551245 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:12.551211 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75257664_1fa7_401a_9046_9a756b9e9335.slice/crio-f36936f4ed3c4c8d16eddb562bc0779da1b59ba251bd0587ecce03b9744e9552 WatchSource:0}: Error finding container f36936f4ed3c4c8d16eddb562bc0779da1b59ba251bd0587ecce03b9744e9552: Status 404 returned error can't find the container with id f36936f4ed3c4c8d16eddb562bc0779da1b59ba251bd0587ecce03b9744e9552 Apr 22 16:06:13.103959 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:13.103923 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerID="89f626e8b18ba4a93f669b6abfe1e7b88921724335081da8673ad1323329328e" exitCode=0 Apr 22 16:06:13.104172 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:13.103977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" event={"ID":"ab610e1a-c1b7-43ec-b499-389d356a8cac","Type":"ContainerDied","Data":"89f626e8b18ba4a93f669b6abfe1e7b88921724335081da8673ad1323329328e"} Apr 22 16:06:13.105161 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:13.105132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" event={"ID":"75257664-1fa7-401a-9046-9a756b9e9335","Type":"ContainerStarted","Data":"f36936f4ed3c4c8d16eddb562bc0779da1b59ba251bd0587ecce03b9744e9552"} Apr 22 16:06:14.112172 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:14.112135 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerID="6ac6b108be33fcaa4c42e7ebdfd2495ff98f31e84a756c5451fcf354fd4da1bb" exitCode=0 Apr 22 16:06:14.112628 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:14.112178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" event={"ID":"ab610e1a-c1b7-43ec-b499-389d356a8cac","Type":"ContainerDied","Data":"6ac6b108be33fcaa4c42e7ebdfd2495ff98f31e84a756c5451fcf354fd4da1bb"} Apr 22 16:06:15.254647 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.254617 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:15.273383 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.273336 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zn5v\" (UniqueName: \"kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v\") pod \"ab610e1a-c1b7-43ec-b499-389d356a8cac\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " Apr 22 16:06:15.273607 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.273417 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle\") pod \"ab610e1a-c1b7-43ec-b499-389d356a8cac\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " Apr 22 16:06:15.273607 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.273503 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util\") pod \"ab610e1a-c1b7-43ec-b499-389d356a8cac\" (UID: \"ab610e1a-c1b7-43ec-b499-389d356a8cac\") " Apr 22 16:06:15.274746 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.274705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle" (OuterVolumeSpecName: "bundle") pod "ab610e1a-c1b7-43ec-b499-389d356a8cac" (UID: "ab610e1a-c1b7-43ec-b499-389d356a8cac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:15.276332 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.276299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v" (OuterVolumeSpecName: "kube-api-access-7zn5v") pod "ab610e1a-c1b7-43ec-b499-389d356a8cac" (UID: "ab610e1a-c1b7-43ec-b499-389d356a8cac"). InnerVolumeSpecName "kube-api-access-7zn5v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:06:15.279464 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.279424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util" (OuterVolumeSpecName: "util") pod "ab610e1a-c1b7-43ec-b499-389d356a8cac" (UID: "ab610e1a-c1b7-43ec-b499-389d356a8cac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:15.298019 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.297980 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-5krlx"] Apr 22 16:06:15.298420 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298405 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="util" Apr 22 16:06:15.298420 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298422 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="util" Apr 22 16:06:15.298577 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="extract" Apr 22 16:06:15.298577 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298442 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="extract" Apr 22 16:06:15.298577 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298456 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="pull" Apr 22 16:06:15.298577 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298465 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="pull" Apr 22 16:06:15.298577 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.298551 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab610e1a-c1b7-43ec-b499-389d356a8cac" containerName="extract" Apr 22 16:06:15.301306 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.301287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.303485 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.303455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 22 16:06:15.303614 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.303468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-xbh9w\"" Apr 22 16:06:15.311552 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.309682 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-5krlx"] Apr 22 16:06:15.374599 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.374466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfgm\" (UniqueName: \"kubernetes.io/projected/a5f2b79a-f0e6-447c-b3df-87cc295c9033-kube-api-access-scfgm\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.374599 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.374599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.374808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.374640 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-util\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:06:15.374808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.374651 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zn5v\" (UniqueName: \"kubernetes.io/projected/ab610e1a-c1b7-43ec-b499-389d356a8cac-kube-api-access-7zn5v\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:06:15.374808 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.374663 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab610e1a-c1b7-43ec-b499-389d356a8cac-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:06:15.475893 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.475849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scfgm\" (UniqueName: \"kubernetes.io/projected/a5f2b79a-f0e6-447c-b3df-87cc295c9033-kube-api-access-scfgm\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.476113 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.476089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.476256 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:15.476237 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" 
not found Apr 22 16:06:15.476330 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:15.476318 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert podName:a5f2b79a-f0e6-447c-b3df-87cc295c9033 nodeName:}" failed. No retries permitted until 2026-04-22 16:06:15.976292619 +0000 UTC m=+453.268481329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert") pod "odh-model-controller-858dbf95b8-5krlx" (UID: "a5f2b79a-f0e6-447c-b3df-87cc295c9033") : secret "odh-model-controller-webhook-cert" not found Apr 22 16:06:15.486545 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.486504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfgm\" (UniqueName: \"kubernetes.io/projected/a5f2b79a-f0e6-447c-b3df-87cc295c9033-kube-api-access-scfgm\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.981414 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:15.981369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:15.981655 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:15.981633 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 16:06:15.981727 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:15.981720 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert podName:a5f2b79a-f0e6-447c-b3df-87cc295c9033 nodeName:}" failed. No retries permitted until 2026-04-22 16:06:16.981695697 +0000 UTC m=+454.273884404 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert") pod "odh-model-controller-858dbf95b8-5krlx" (UID: "a5f2b79a-f0e6-447c-b3df-87cc295c9033") : secret "odh-model-controller-webhook-cert" not found Apr 22 16:06:16.122420 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:16.122377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" event={"ID":"ab610e1a-c1b7-43ec-b499-389d356a8cac","Type":"ContainerDied","Data":"35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352"} Apr 22 16:06:16.122420 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:16.122412 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835thtln" Apr 22 16:06:16.122694 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:16.122421 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35ce1ebf953cfe80a8390b91be77c762640f465fb09d6387eb5df8e0acbc2352" Apr 22 16:06:16.991770 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:16.991731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:16.994550 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:16.994501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f2b79a-f0e6-447c-b3df-87cc295c9033-cert\") pod \"odh-model-controller-858dbf95b8-5krlx\" (UID: \"a5f2b79a-f0e6-447c-b3df-87cc295c9033\") " pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:17.128385 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:17.128347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" event={"ID":"75257664-1fa7-401a-9046-9a756b9e9335","Type":"ContainerStarted","Data":"6ee970fefe4f6226e8c4526068139cdaed9affbb7be702b62ef1a768e09325e7"} Apr 22 16:06:17.136139 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:17.136105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:17.145102 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:17.145037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-ldk8l" podStartSLOduration=1.361373935 podStartE2EDuration="5.145020667s" podCreationTimestamp="2026-04-22 16:06:12 +0000 UTC" firstStartedPulling="2026-04-22 16:06:12.553277252 +0000 UTC m=+449.845465955" lastFinishedPulling="2026-04-22 16:06:16.336923985 +0000 UTC m=+453.629112687" observedRunningTime="2026-04-22 16:06:17.142340284 +0000 UTC m=+454.434529008" watchObservedRunningTime="2026-04-22 16:06:17.145020667 +0000 UTC m=+454.437209392" Apr 22 16:06:17.273193 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:17.273129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-5krlx"] Apr 22 16:06:17.275958 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:17.275909 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5f2b79a_f0e6_447c_b3df_87cc295c9033.slice/crio-3ed71ac922b6b27564c6e146bb9b61a40a267ace694a6926e971bd52bfaf3839 WatchSource:0}: Error finding container 3ed71ac922b6b27564c6e146bb9b61a40a267ace694a6926e971bd52bfaf3839: Status 404 returned error can't find the container with id 3ed71ac922b6b27564c6e146bb9b61a40a267ace694a6926e971bd52bfaf3839 Apr 22 16:06:18.133553 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:18.133498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" event={"ID":"a5f2b79a-f0e6-447c-b3df-87cc295c9033","Type":"ContainerStarted","Data":"3ed71ac922b6b27564c6e146bb9b61a40a267ace694a6926e971bd52bfaf3839"} Apr 22 16:06:20.755521 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.755471 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-tvp8f"] Apr 22 16:06:20.758313 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.758283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:20.760818 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.760790 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 22 16:06:20.760996 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.760800 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-xz69h\"" Apr 22 16:06:20.766890 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.766859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-tvp8f"] Apr 22 16:06:20.829106 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.829053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:20.829306 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.829226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkj65\" (UniqueName: \"kubernetes.io/projected/4f6f0783-eae9-4ae1-92ed-e4430af515bf-kube-api-access-rkj65\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:20.930613 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.930556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:20.930785 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.930701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkj65\" (UniqueName: \"kubernetes.io/projected/4f6f0783-eae9-4ae1-92ed-e4430af515bf-kube-api-access-rkj65\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:20.931314 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:20.930979 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 16:06:20.931314 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:06:20.931076 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert podName:4f6f0783-eae9-4ae1-92ed-e4430af515bf nodeName:}" failed. No retries permitted until 2026-04-22 16:06:21.431054462 +0000 UTC m=+458.723243179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert") pod "kserve-controller-manager-856948b99f-tvp8f" (UID: "4f6f0783-eae9-4ae1-92ed-e4430af515bf") : secret "kserve-webhook-server-cert" not found Apr 22 16:06:20.940728 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:20.940691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkj65\" (UniqueName: \"kubernetes.io/projected/4f6f0783-eae9-4ae1-92ed-e4430af515bf-kube-api-access-rkj65\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:21.436994 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.436950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:21.439646 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.439617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6f0783-eae9-4ae1-92ed-e4430af515bf-cert\") pod \"kserve-controller-manager-856948b99f-tvp8f\" (UID: \"4f6f0783-eae9-4ae1-92ed-e4430af515bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:21.644377 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.644269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:21.644377 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.644349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:21.649498 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.649459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:21.674160 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.674120 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:21.818719 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:21.818673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-tvp8f"] Apr 22 16:06:21.822765 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:21.822733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f6f0783_eae9_4ae1_92ed_e4430af515bf.slice/crio-0b57229ac82b72100fed20c44788a8f6588ebc763741366b066c051e1fb0e6e8 WatchSource:0}: Error finding container 0b57229ac82b72100fed20c44788a8f6588ebc763741366b066c051e1fb0e6e8: Status 404 returned error can't find the container with id 0b57229ac82b72100fed20c44788a8f6588ebc763741366b066c051e1fb0e6e8 Apr 22 16:06:22.151559 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:22.151501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" event={"ID":"a5f2b79a-f0e6-447c-b3df-87cc295c9033","Type":"ContainerStarted","Data":"50fd228457a0ffe0f2ae798d4691dfec47cd0a85bd5a381ad75a7243fd7fb5de"} Apr 22 16:06:22.151794 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:22.151774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" Apr 22 16:06:22.152779 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:22.152750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" event={"ID":"4f6f0783-eae9-4ae1-92ed-e4430af515bf","Type":"ContainerStarted","Data":"0b57229ac82b72100fed20c44788a8f6588ebc763741366b066c051e1fb0e6e8"} Apr 22 16:06:22.156608 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:22.156584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fcf674f7-rsdpn" Apr 22 16:06:22.169001 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:22.168945 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx" podStartSLOduration=3.207273982 podStartE2EDuration="7.168931738s" podCreationTimestamp="2026-04-22 16:06:15 +0000 UTC" firstStartedPulling="2026-04-22 16:06:17.277339686 +0000 UTC m=+454.569528388" lastFinishedPulling="2026-04-22 16:06:21.238997438 +0000 UTC m=+458.531186144" observedRunningTime="2026-04-22 16:06:22.16675081 +0000 UTC m=+459.458939547" watchObservedRunningTime="2026-04-22 16:06:22.168931738 +0000 UTC m=+459.461120463" Apr 22 16:06:25.168233 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.168177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" event={"ID":"4f6f0783-eae9-4ae1-92ed-e4430af515bf","Type":"ContainerStarted","Data":"ef789644c5480ddac769bdc1e2f406759a90b04ef52b2855adf37359c2b5ad16"} Apr 22 16:06:25.168862 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.168432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" Apr 22 16:06:25.190310 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.190246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f" podStartSLOduration=2.602038988 podStartE2EDuration="5.190226859s" podCreationTimestamp="2026-04-22 16:06:20 +0000 UTC" firstStartedPulling="2026-04-22 16:06:21.8244508 
+0000 UTC m=+459.116639501" lastFinishedPulling="2026-04-22 16:06:24.412638659 +0000 UTC m=+461.704827372" observedRunningTime="2026-04-22 16:06:25.188774947 +0000 UTC m=+462.480963682" watchObservedRunningTime="2026-04-22 16:06:25.190226859 +0000 UTC m=+462.482415607" Apr 22 16:06:25.377941 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.377901 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49"] Apr 22 16:06:25.381444 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.381409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.383960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.383924 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:06:25.384172 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.383990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:06:25.384172 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.384103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\"" Apr 22 16:06:25.398855 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.398820 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49"] Apr 22 16:06:25.474286 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.474238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.474497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.474357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.474497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.474403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6pd4\" (UniqueName: \"kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.575575 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.575520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.575575 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.575580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6pd4\" (UniqueName: \"kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.575827 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.575637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.576037 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.576004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.576166 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.576041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.590243 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.590196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6pd4\" (UniqueName: \"kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.695419 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.695377 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" Apr 22 16:06:25.861035 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:25.861007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49"] Apr 22 16:06:25.863885 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:25.863833 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbd557b_1dee_4986_93f0_f4289c130638.slice/crio-a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5 WatchSource:0}: Error finding container a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5: Status 404 returned error can't find the container with id a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5 Apr 22 16:06:26.174951 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.174851 2576 generic.go:358] "Generic (PLEG): container finished" podID="adbd557b-1dee-4986-93f0-f4289c130638" containerID="ef5f91ca8b82553c836325d0e9ffca2c731d980fd6f099976a72799484d5533d" exitCode=0 Apr 22 16:06:26.174951 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.174925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" event={"ID":"adbd557b-1dee-4986-93f0-f4289c130638","Type":"ContainerDied","Data":"ef5f91ca8b82553c836325d0e9ffca2c731d980fd6f099976a72799484d5533d"} Apr 22 16:06:26.175445 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.174967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" event={"ID":"adbd557b-1dee-4986-93f0-f4289c130638","Type":"ContainerStarted","Data":"a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5"} Apr 22 16:06:26.658407 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.658370 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"] Apr 22 16:06:26.660943 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.660919 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.663550 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.663504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-6ng9j\""
Apr 22 16:06:26.663711 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.663627 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 22 16:06:26.664013 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.663992 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 22 16:06:26.674685 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.674651 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"]
Apr 22 16:06:26.786836 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.786787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/03a882d1-1626-473b-b45b-5633f1d10090-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.787005 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.786846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrmf\" (UniqueName: \"kubernetes.io/projected/03a882d1-1626-473b-b45b-5633f1d10090-kube-api-access-5hrmf\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.887646 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.887546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/03a882d1-1626-473b-b45b-5633f1d10090-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.887646 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.887626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrmf\" (UniqueName: \"kubernetes.io/projected/03a882d1-1626-473b-b45b-5633f1d10090-kube-api-access-5hrmf\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.890383 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.890359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/03a882d1-1626-473b-b45b-5633f1d10090-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.895878 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.895850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrmf\" (UniqueName: \"kubernetes.io/projected/03a882d1-1626-473b-b45b-5633f1d10090-kube-api-access-5hrmf\") pod \"servicemesh-operator3-55f49c5f94-hlcnl\" (UID: \"03a882d1-1626-473b-b45b-5633f1d10090\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:26.972049 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:26.972000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:27.128520 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:27.128482 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"]
Apr 22 16:06:27.136070 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:27.136030 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a882d1_1626_473b_b45b_5633f1d10090.slice/crio-e6ddea13e7c5c396ab93885480e86ffa55d37574b180913d5199677ec7e052ae WatchSource:0}: Error finding container e6ddea13e7c5c396ab93885480e86ffa55d37574b180913d5199677ec7e052ae: Status 404 returned error can't find the container with id e6ddea13e7c5c396ab93885480e86ffa55d37574b180913d5199677ec7e052ae
Apr 22 16:06:27.179663 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:27.179624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl" event={"ID":"03a882d1-1626-473b-b45b-5633f1d10090","Type":"ContainerStarted","Data":"e6ddea13e7c5c396ab93885480e86ffa55d37574b180913d5199677ec7e052ae"}
Apr 22 16:06:28.186602 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:28.186554 2576 generic.go:358] "Generic (PLEG): container finished" podID="adbd557b-1dee-4986-93f0-f4289c130638" containerID="0790ddf824ad544cc897164da8d264c33826be94e92a8903b43f706235618c72" exitCode=0
Apr 22 16:06:28.187100 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:28.186629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" event={"ID":"adbd557b-1dee-4986-93f0-f4289c130638","Type":"ContainerDied","Data":"0790ddf824ad544cc897164da8d264c33826be94e92a8903b43f706235618c72"}
Apr 22 16:06:29.194340 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:29.194301 2576 generic.go:358] "Generic (PLEG): container finished" podID="adbd557b-1dee-4986-93f0-f4289c130638" containerID="470aa759abb339469d531a32efc933c0ae300fa3e4a6051d23489603b800b0dc" exitCode=0
Apr 22 16:06:29.194830 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:29.194371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" event={"ID":"adbd557b-1dee-4986-93f0-f4289c130638","Type":"ContainerDied","Data":"470aa759abb339469d531a32efc933c0ae300fa3e4a6051d23489603b800b0dc"}
Apr 22 16:06:30.200899 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.200850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl" event={"ID":"03a882d1-1626-473b-b45b-5633f1d10090","Type":"ContainerStarted","Data":"149b5a7153ac4d3ddd4e48e511b1141454877cbc179764e8477c6920056717a5"}
Apr 22 16:06:30.201399 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.201061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:30.221681 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.221499 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl" podStartSLOduration=1.378194692 podStartE2EDuration="4.221472703s" podCreationTimestamp="2026-04-22 16:06:26 +0000 UTC" firstStartedPulling="2026-04-22 16:06:27.138791301 +0000 UTC m=+464.430980002" lastFinishedPulling="2026-04-22 16:06:29.98206931 +0000 UTC m=+467.274258013" observedRunningTime="2026-04-22 16:06:30.217787677 +0000 UTC m=+467.509976401" watchObservedRunningTime="2026-04-22 16:06:30.221472703 +0000 UTC m=+467.513661431"
Apr 22 16:06:30.374319 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.374292 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49"
Apr 22 16:06:30.429693 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.429654 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util\") pod \"adbd557b-1dee-4986-93f0-f4289c130638\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") "
Apr 22 16:06:30.429901 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.429722 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle\") pod \"adbd557b-1dee-4986-93f0-f4289c130638\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") "
Apr 22 16:06:30.429901 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.429784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6pd4\" (UniqueName: \"kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4\") pod \"adbd557b-1dee-4986-93f0-f4289c130638\" (UID: \"adbd557b-1dee-4986-93f0-f4289c130638\") "
Apr 22 16:06:30.431287 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.431231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle" (OuterVolumeSpecName: "bundle") pod "adbd557b-1dee-4986-93f0-f4289c130638" (UID: "adbd557b-1dee-4986-93f0-f4289c130638"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:06:30.432346 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.432309 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4" (OuterVolumeSpecName: "kube-api-access-f6pd4") pod "adbd557b-1dee-4986-93f0-f4289c130638" (UID: "adbd557b-1dee-4986-93f0-f4289c130638"). InnerVolumeSpecName "kube-api-access-f6pd4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:06:30.438307 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.438246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util" (OuterVolumeSpecName: "util") pod "adbd557b-1dee-4986-93f0-f4289c130638" (UID: "adbd557b-1dee-4986-93f0-f4289c130638"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:06:30.531022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.530919 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:06:30.531022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.530958 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6pd4\" (UniqueName: \"kubernetes.io/projected/adbd557b-1dee-4986-93f0-f4289c130638-kube-api-access-f6pd4\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:06:30.531022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:30.530968 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adbd557b-1dee-4986-93f0-f4289c130638-util\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:06:31.206298 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:31.206255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49" event={"ID":"adbd557b-1dee-4986-93f0-f4289c130638","Type":"ContainerDied","Data":"a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5"}
Apr 22 16:06:31.206298 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:31.206299 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d0407989f3d38c89a21b57aac50ead9df8858d95eb9265ea05ebd65813f5f5"
Apr 22 16:06:31.206805 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:31.206413 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebb8z49"
Apr 22 16:06:33.158866 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:33.158837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-5krlx"
Apr 22 16:06:41.209581 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:41.209510 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hlcnl"
Apr 22 16:06:42.248802 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.248764 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"]
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249128 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="pull"
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249141 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="pull"
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249158 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="util"
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249163 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="util"
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249176 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="extract"
Apr 22 16:06:42.249188 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249183 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="extract"
Apr 22 16:06:42.249390 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.249240 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbd557b-1dee-4986-93f0-f4289c130638" containerName="extract"
Apr 22 16:06:42.252447 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.252425 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.256113 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.256076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 22 16:06:42.256362 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.256337 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 16:06:42.256444 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.256338 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 22 16:06:42.256444 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.256432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-mqbsc\""
Apr 22 16:06:42.256560 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.256497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 22 16:06:42.277428 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.277387 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"]
Apr 22 16:06:42.343277 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwks5\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-kube-api-access-qwks5\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343443 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184a27b6-8d63-420b-ab37-a48184d2303c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343443 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343443 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343443 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343443 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.343687 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.343454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.444637 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.444859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.444859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.444859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwks5\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-kube-api-access-qwks5\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.444859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184a27b6-8d63-420b-ab37-a48184d2303c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.445090 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.445090 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.444889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.445444 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.445409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.447505 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.447480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184a27b6-8d63-420b-ab37-a48184d2303c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.447711 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.447690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.447756 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.447724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.447819 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.447798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184a27b6-8d63-420b-ab37-a48184d2303c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.458965 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.458936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.459563 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.459520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwks5\" (UniqueName: \"kubernetes.io/projected/184a27b6-8d63-420b-ab37-a48184d2303c-kube-api-access-qwks5\") pod \"istiod-openshift-gateway-55ff986f96-pgtbn\" (UID: \"184a27b6-8d63-420b-ab37-a48184d2303c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.563029 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.562937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:42.722856 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:42.722672 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"]
Apr 22 16:06:42.725981 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:06:42.725949 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184a27b6_8d63_420b_ab37_a48184d2303c.slice/crio-66d73453eb82cc23dba05d2ccbec7897f4d390625eb69d5fea4d060ed88ef176 WatchSource:0}: Error finding container 66d73453eb82cc23dba05d2ccbec7897f4d390625eb69d5fea4d060ed88ef176: Status 404 returned error can't find the container with id 66d73453eb82cc23dba05d2ccbec7897f4d390625eb69d5fea4d060ed88ef176
Apr 22 16:06:43.254912 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:43.254876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn" event={"ID":"184a27b6-8d63-420b-ab37-a48184d2303c","Type":"ContainerStarted","Data":"66d73453eb82cc23dba05d2ccbec7897f4d390625eb69d5fea4d060ed88ef176"}
Apr 22 16:06:45.365505 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:45.365456 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 22 16:06:45.365866 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:45.365550 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 22 16:06:46.273829 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:46.273791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn" event={"ID":"184a27b6-8d63-420b-ab37-a48184d2303c","Type":"ContainerStarted","Data":"eb09cb080bd22828dd223db8570a4151d8ab71c21acbff2cd0be17f211197608"}
Apr 22 16:06:46.274069 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:46.274026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:46.275861 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:46.275823 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-pgtbn container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 22 16:06:46.275997 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:46.275909 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn" podUID="184a27b6-8d63-420b-ab37-a48184d2303c" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 16:06:46.295798 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:46.295735 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn" podStartSLOduration=1.658778834 podStartE2EDuration="4.29571552s" podCreationTimestamp="2026-04-22 16:06:42 +0000 UTC" firstStartedPulling="2026-04-22 16:06:42.728254954 +0000 UTC m=+480.020443687" lastFinishedPulling="2026-04-22 16:06:45.365191664 +0000 UTC m=+482.657380373" observedRunningTime="2026-04-22 16:06:46.293171245 +0000 UTC m=+483.585359970" watchObservedRunningTime="2026-04-22 16:06:46.29571552 +0000 UTC m=+483.587904244"
Apr 22 16:06:47.279282 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:47.279243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-pgtbn"
Apr 22 16:06:56.180869 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:06:56.180831 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-tvp8f"
Apr 22 16:07:51.833303 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.833209 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"]
Apr 22 16:07:51.836941 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.836913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:51.839111 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.839084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 16:07:51.839247 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.839096 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 16:07:51.839731 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.839702 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-lgggn\""
Apr 22 16:07:51.847275 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.847247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"]
Apr 22 16:07:51.872135 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.872096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snfv\" (UniqueName: \"kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv\") pod \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" (UID: \"396b6336-9580-45f6-8667-2b199c132f5a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:51.973153 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.973114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4snfv\" (UniqueName: \"kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv\") pod \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" (UID: \"396b6336-9580-45f6-8667-2b199c132f5a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:51.983850 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:51.983814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4snfv\" (UniqueName: \"kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv\") pod \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" (UID: \"396b6336-9580-45f6-8667-2b199c132f5a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:52.149003 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:52.148883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:52.285063 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:52.284897 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"]
Apr 22 16:07:52.288084 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:07:52.288053 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396b6336_9580_45f6_8667_2b199c132f5a.slice/crio-ef6a3fbc5a74fd974a46bf3f3b019a5cce618dd05f73f5696999b7948052d4ee WatchSource:0}: Error finding container ef6a3fbc5a74fd974a46bf3f3b019a5cce618dd05f73f5696999b7948052d4ee: Status 404 returned error can't find the container with id ef6a3fbc5a74fd974a46bf3f3b019a5cce618dd05f73f5696999b7948052d4ee
Apr 22 16:07:52.530192 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:52.530156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" event={"ID":"396b6336-9580-45f6-8667-2b199c132f5a","Type":"ContainerStarted","Data":"ef6a3fbc5a74fd974a46bf3f3b019a5cce618dd05f73f5696999b7948052d4ee"}
Apr 22 16:07:55.544477 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:55.544430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" event={"ID":"396b6336-9580-45f6-8667-2b199c132f5a","Type":"ContainerStarted","Data":"8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52"}
Apr 22 16:07:55.545039 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:55.544504 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:07:55.562028 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:07:55.561959 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" podStartSLOduration=1.999154872 podStartE2EDuration="4.561940587s" podCreationTimestamp="2026-04-22 16:07:51 +0000 UTC" firstStartedPulling="2026-04-22 16:07:52.290165744 +0000 UTC m=+549.582354445" lastFinishedPulling="2026-04-22 16:07:54.852951444 +0000 UTC m=+552.145140160" observedRunningTime="2026-04-22 16:07:55.560557818 +0000 UTC m=+552.852746532" watchObservedRunningTime="2026-04-22 16:07:55.561940587 +0000 UTC m=+552.854129311"
Apr 22 16:08:03.636235 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.636185 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"]
Apr 22 16:08:03.639868 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.639844 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.642210 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.642184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ws92q\""
Apr 22 16:08:03.652517 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.652484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"]
Apr 22 16:08:03.673002 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.672951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gwmm\" (UniqueName: \"kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.673206 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.673125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.774240 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.774198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.774422 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.774262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gwmm\" (UniqueName: \"kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.774665 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.774644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.790492 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.790454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gwmm\" (UniqueName: \"kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm\") pod \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:03.953337 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:03.953297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:04.092795 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.092765 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"]
Apr 22 16:08:04.095212 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:08:04.095180 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a86337f_7a25_4e13_bdab_efcd69ba858d.slice/crio-8e4fccea542bd9f208b082c0341405a34140acbc5e329e412e972f5eef3cc402 WatchSource:0}: Error finding container 8e4fccea542bd9f208b082c0341405a34140acbc5e329e412e972f5eef3cc402: Status 404 returned error can't find the container with id 8e4fccea542bd9f208b082c0341405a34140acbc5e329e412e972f5eef3cc402
Apr 22 16:08:04.135847 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.135805 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"]
Apr 22 16:08:04.144371 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.144328 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"]
Apr 22 16:08:04.161050 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.160985 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"]
Apr 22 16:08:04.166176 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.166144 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"]
Apr 22 16:08:04.166350 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.166309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.166644 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.166447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" podUID="396b6336-9580-45f6-8667-2b199c132f5a" containerName="manager" containerID="cri-o://8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52" gracePeriod=2
Apr 22 16:08:04.168910 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.168796 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:08:04.176810 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.176139 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"]
Apr 22 16:08:04.179675 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.179369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"]
Apr 22 16:08:04.184448 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.184414 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"]
Apr 22 16:08:04.184837 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.184821 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396b6336-9580-45f6-8667-2b199c132f5a" containerName="manager"
Apr 22 16:08:04.184837 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.184838 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="396b6336-9580-45f6-8667-2b199c132f5a" containerName="manager"
Apr 22 16:08:04.184939 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.184898 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="396b6336-9580-45f6-8667-2b199c132f5a" containerName="manager"
Apr 22 16:08:04.187971 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.187936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:04.194060 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.194018 2576 status_manager.go:895] "Failed to get status for pod" podUID="396b6336-9580-45f6-8667-2b199c132f5a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" err="pods \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:04.196038 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.195991 2576 status_manager.go:895] "Failed to get status for pod" podUID="396b6336-9580-45f6-8667-2b199c132f5a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" err="pods \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:04.198393 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.198364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"]
Apr 22 16:08:04.279593 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.279549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77d5\" (UniqueName: \"kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.279787 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.279605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfc8\" (UniqueName: \"kubernetes.io/projected/f538ea4a-710e-4b41-bc9c-ccb286ac8d8d-kube-api-access-ngfc8\") pod \"limitador-operator-controller-manager-85c4996f8c-t64x5\" (UID: \"f538ea4a-710e-4b41-bc9c-ccb286ac8d8d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:04.279787 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.279695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.380247 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.380201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.380453 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.380311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c77d5\" (UniqueName: \"kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.380453 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.380338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfc8\" (UniqueName: \"kubernetes.io/projected/f538ea4a-710e-4b41-bc9c-ccb286ac8d8d-kube-api-access-ngfc8\") pod \"limitador-operator-controller-manager-85c4996f8c-t64x5\" (UID: \"f538ea4a-710e-4b41-bc9c-ccb286ac8d8d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:04.380692 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.380670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.390933 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.390899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c77d5\" (UniqueName: \"kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5\") pod \"kuadrant-operator-controller-manager-84b657d985-cbvsr\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.391233 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.391210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfc8\" (UniqueName: \"kubernetes.io/projected/f538ea4a-710e-4b41-bc9c-ccb286ac8d8d-kube-api-access-ngfc8\") pod \"limitador-operator-controller-manager-85c4996f8c-t64x5\" (UID: \"f538ea4a-710e-4b41-bc9c-ccb286ac8d8d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:04.405628 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.405602 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:08:04.407699 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.407666 2576 status_manager.go:895] "Failed to get status for pod" podUID="396b6336-9580-45f6-8667-2b199c132f5a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" err="pods \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:04.481177 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.481136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4snfv\" (UniqueName: \"kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv\") pod \"396b6336-9580-45f6-8667-2b199c132f5a\" (UID: \"396b6336-9580-45f6-8667-2b199c132f5a\") "
Apr 22 16:08:04.483652 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.483617 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv" (OuterVolumeSpecName: "kube-api-access-4snfv") pod "396b6336-9580-45f6-8667-2b199c132f5a" (UID: "396b6336-9580-45f6-8667-2b199c132f5a"). InnerVolumeSpecName "kube-api-access-4snfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:08:04.555563 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.555450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:04.562346 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.562313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:04.582553 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.582500 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4snfv\" (UniqueName: \"kubernetes.io/projected/396b6336-9580-45f6-8667-2b199c132f5a-kube-api-access-4snfv\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:08:04.586224 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.586188 2576 generic.go:358] "Generic (PLEG): container finished" podID="396b6336-9580-45f6-8667-2b199c132f5a" containerID="8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52" exitCode=0
Apr 22 16:08:04.586224 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.586249 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb"
Apr 22 16:08:04.586224 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.586286 2576 scope.go:117] "RemoveContainer" containerID="8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52"
Apr 22 16:08:04.588673 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.588647 2576 status_manager.go:895] "Failed to get status for pod" podUID="396b6336-9580-45f6-8667-2b199c132f5a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" err="pods \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:04.600854 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.600810 2576 status_manager.go:895] "Failed to get status for pod" podUID="396b6336-9580-45f6-8667-2b199c132f5a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-lbwnb" err="pods \"limitador-operator-controller-manager-85c4996f8c-lbwnb\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:04.728913 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.728717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"]
Apr 22 16:08:04.772814 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.772774 2576 scope.go:117] "RemoveContainer" containerID="8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52"
Apr 22 16:08:04.773258 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:08:04.773229 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52\": container with ID starting with 8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52 not found: ID does not exist" containerID="8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52"
Apr 22 16:08:04.773341 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.773274 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52"} err="failed to get container status \"8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52\": rpc error: code = NotFound desc = could not find container \"8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52\": container with ID starting with 8194346e7414f73ae927836b8bccd27461473926588fe5b1d14db883efa72f52 not found: ID does not exist"
Apr 22 16:08:04.777471 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:08:04.777438 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a9549b_1ed7_4a10_a376_55a50d088313.slice/crio-5db3e9a546b86e6dde0e8685f98ab5663f1ffd84f98092a6fa25e89eec778262 WatchSource:0}: Error finding container 5db3e9a546b86e6dde0e8685f98ab5663f1ffd84f98092a6fa25e89eec778262: Status 404 returned error can't find the container with id 5db3e9a546b86e6dde0e8685f98ab5663f1ffd84f98092a6fa25e89eec778262
Apr 22 16:08:04.936800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:04.936731 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"]
Apr 22 16:08:04.968568 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:08:04.968084 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf538ea4a_710e_4b41_bc9c_ccb286ac8d8d.slice/crio-1a9f9be594aec93b554788e5739d9a1aebaaa8b4a70738e9ea6b4d36343843d7 WatchSource:0}: Error finding container 1a9f9be594aec93b554788e5739d9a1aebaaa8b4a70738e9ea6b4d36343843d7: Status 404 returned error can't find the container with id 1a9f9be594aec93b554788e5739d9a1aebaaa8b4a70738e9ea6b4d36343843d7
Apr 22 16:08:05.395097 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.395057 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396b6336-9580-45f6-8667-2b199c132f5a" path="/var/lib/kubelet/pods/396b6336-9580-45f6-8667-2b199c132f5a/volumes"
Apr 22 16:08:05.595404 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.595289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5" event={"ID":"f538ea4a-710e-4b41-bc9c-ccb286ac8d8d","Type":"ContainerStarted","Data":"635e329c49ffefc8fc251178bfbec4a2df2aeedd6884ae34e0f9aacb284c7e87"}
Apr 22 16:08:05.595404 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.595344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5" event={"ID":"f538ea4a-710e-4b41-bc9c-ccb286ac8d8d","Type":"ContainerStarted","Data":"1a9f9be594aec93b554788e5739d9a1aebaaa8b4a70738e9ea6b4d36343843d7"}
Apr 22 16:08:05.595739 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.595459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5"
Apr 22 16:08:05.598700 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.598657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" event={"ID":"96a9549b-1ed7-4a10-a376-55a50d088313","Type":"ContainerStarted","Data":"5db3e9a546b86e6dde0e8685f98ab5663f1ffd84f98092a6fa25e89eec778262"}
Apr 22 16:08:05.615351 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:05.615280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5" podStartSLOduration=1.61525447 podStartE2EDuration="1.61525447s" podCreationTimestamp="2026-04-22 16:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:08:05.613583355 +0000 UTC m=+562.905772081" watchObservedRunningTime="2026-04-22 16:08:05.61525447 +0000 UTC m=+562.907443198"
Apr 22 16:08:08.615824 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.615713 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" containerName="manager" containerID="cri-o://6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180" gracePeriod=2
Apr 22 16:08:08.617688 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.617653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" event={"ID":"96a9549b-1ed7-4a10-a376-55a50d088313","Type":"ContainerStarted","Data":"584814909ac1dc6d30107a2a17abe1f8128fe6fbea93863544ab076adc47096a"}
Apr 22 16:08:08.617826 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.617787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"
Apr 22 16:08:08.640153 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.640091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" podStartSLOduration=1.057976034 podStartE2EDuration="4.640072753s" podCreationTimestamp="2026-04-22 16:08:04 +0000 UTC" firstStartedPulling="2026-04-22 16:08:04.780610522 +0000 UTC m=+562.072799223" lastFinishedPulling="2026-04-22 16:08:08.36270724 +0000 UTC m=+565.654895942" observedRunningTime="2026-04-22 16:08:08.637287925 +0000 UTC m=+565.929476650" watchObservedRunningTime="2026-04-22 16:08:08.640072753 +0000 UTC m=+565.932261476"
Apr 22 16:08:08.872336 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.872218 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:08.874388 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.874350 2576 status_manager.go:895] "Failed to get status for pod" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp" err="pods \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:08.929560 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.929479 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gwmm\" (UniqueName: \"kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm\") pod \"1a86337f-7a25-4e13-bdab-efcd69ba858d\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") "
Apr 22 16:08:08.929840 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.929692 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume\") pod \"1a86337f-7a25-4e13-bdab-efcd69ba858d\" (UID: \"1a86337f-7a25-4e13-bdab-efcd69ba858d\") "
Apr 22 16:08:08.929998 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.929967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1a86337f-7a25-4e13-bdab-efcd69ba858d" (UID: "1a86337f-7a25-4e13-bdab-efcd69ba858d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:08:08.932374 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:08.932329 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm" (OuterVolumeSpecName: "kube-api-access-2gwmm") pod "1a86337f-7a25-4e13-bdab-efcd69ba858d" (UID: "1a86337f-7a25-4e13-bdab-efcd69ba858d"). InnerVolumeSpecName "kube-api-access-2gwmm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:08:09.031388 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.031343 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a86337f-7a25-4e13-bdab-efcd69ba858d-extensions-socket-volume\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:08:09.031388 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.031378 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gwmm\" (UniqueName: \"kubernetes.io/projected/1a86337f-7a25-4e13-bdab-efcd69ba858d-kube-api-access-2gwmm\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\""
Apr 22 16:08:09.395703 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.395665 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" path="/var/lib/kubelet/pods/1a86337f-7a25-4e13-bdab-efcd69ba858d/volumes"
Apr 22 16:08:09.622704 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.622662 2576 generic.go:358] "Generic (PLEG): container finished" podID="1a86337f-7a25-4e13-bdab-efcd69ba858d" containerID="6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180" exitCode=2
Apr 22 16:08:09.623151 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.622713 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp"
Apr 22 16:08:09.623151 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.622759 2576 scope.go:117] "RemoveContainer" containerID="6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180"
Apr 22 16:08:09.627472 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.627441 2576 status_manager.go:895] "Failed to get status for pod" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-k7qvp" err="pods \"kuadrant-operator-controller-manager-84b657d985-k7qvp\" is forbidden: User \"system:node:ip-10-0-132-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-57.ec2.internal' and this object"
Apr 22 16:08:09.631769 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.631750 2576 scope.go:117] "RemoveContainer" containerID="6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180"
Apr 22 16:08:09.632064 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:08:09.632044 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180\": container with ID starting with 6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180 not found: ID does not exist" containerID="6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180"
Apr 22 16:08:09.632112 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:09.632074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180"} err="failed to get container status \"6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180\": rpc error: code = NotFound desc = could not find container \"6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180\": container with ID starting with 6cd14708240c318f14a7920bff0e9cf3040aba3fd92956e421b8e857990f2180 not found: ID does not exist"
16:08:16.607659 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:16.607623 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-t64x5" Apr 22 16:08:19.625673 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:19.625638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" Apr 22 16:08:31.549135 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.549099 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"] Apr 22 16:08:31.549714 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.549332 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" podUID="96a9549b-1ed7-4a10-a376-55a50d088313" containerName="manager" containerID="cri-o://584814909ac1dc6d30107a2a17abe1f8128fe6fbea93863544ab076adc47096a" gracePeriod=10 Apr 22 16:08:31.712281 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.712248 2576 generic.go:358] "Generic (PLEG): container finished" podID="96a9549b-1ed7-4a10-a376-55a50d088313" containerID="584814909ac1dc6d30107a2a17abe1f8128fe6fbea93863544ab076adc47096a" exitCode=0 Apr 22 16:08:31.712446 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.712304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" event={"ID":"96a9549b-1ed7-4a10-a376-55a50d088313","Type":"ContainerDied","Data":"584814909ac1dc6d30107a2a17abe1f8128fe6fbea93863544ab076adc47096a"} Apr 22 16:08:31.805358 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.805286 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" Apr 22 16:08:31.944067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.944027 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c77d5\" (UniqueName: \"kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5\") pod \"96a9549b-1ed7-4a10-a376-55a50d088313\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " Apr 22 16:08:31.944067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.944066 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume\") pod \"96a9549b-1ed7-4a10-a376-55a50d088313\" (UID: \"96a9549b-1ed7-4a10-a376-55a50d088313\") " Apr 22 16:08:31.944713 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.944676 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "96a9549b-1ed7-4a10-a376-55a50d088313" (UID: "96a9549b-1ed7-4a10-a376-55a50d088313"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:08:31.946366 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:31.946344 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5" (OuterVolumeSpecName: "kube-api-access-c77d5") pod "96a9549b-1ed7-4a10-a376-55a50d088313" (UID: "96a9549b-1ed7-4a10-a376-55a50d088313"). InnerVolumeSpecName "kube-api-access-c77d5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:08:32.045318 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.045270 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c77d5\" (UniqueName: \"kubernetes.io/projected/96a9549b-1ed7-4a10-a376-55a50d088313-kube-api-access-c77d5\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:08:32.045318 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.045305 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/96a9549b-1ed7-4a10-a376-55a50d088313-extensions-socket-volume\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:08:32.718070 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.718038 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" Apr 22 16:08:32.718598 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.718043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr" event={"ID":"96a9549b-1ed7-4a10-a376-55a50d088313","Type":"ContainerDied","Data":"5db3e9a546b86e6dde0e8685f98ab5663f1ffd84f98092a6fa25e89eec778262"} Apr 22 16:08:32.718598 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.718167 2576 scope.go:117] "RemoveContainer" containerID="584814909ac1dc6d30107a2a17abe1f8128fe6fbea93863544ab076adc47096a" Apr 22 16:08:32.747596 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.747560 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"] Apr 22 16:08:32.750046 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:32.750014 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-cbvsr"] Apr 22 16:08:33.392607 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:33.392573 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a9549b-1ed7-4a10-a376-55a50d088313" path="/var/lib/kubelet/pods/96a9549b-1ed7-4a10-a376-55a50d088313/volumes" Apr 22 16:08:43.292286 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:43.292254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 16:08:43.294554 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:43.294504 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 16:08:48.791018 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.790981 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:08:48.791412 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791343 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="96a9549b-1ed7-4a10-a376-55a50d088313" containerName="manager" Apr 22 16:08:48.791412 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791354 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a9549b-1ed7-4a10-a376-55a50d088313" containerName="manager" Apr 22 16:08:48.791412 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791366 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" containerName="manager" Apr 22 16:08:48.791412 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791372 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" containerName="manager" Apr 22 16:08:48.791671 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791424 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="96a9549b-1ed7-4a10-a376-55a50d088313" containerName="manager" Apr 22 16:08:48.791671 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.791435 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a86337f-7a25-4e13-bdab-efcd69ba858d" containerName="manager" Apr 22 16:08:48.795217 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.795190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:48.797334 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.797296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 16:08:48.797471 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.797344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c25fv\"" Apr 22 16:08:48.801048 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.801007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:08:48.897938 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.897900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:08:48.898161 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.898138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlqz\" (UniqueName: \"kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:48.898228 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.898211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:48.999242 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.999194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlqz\" (UniqueName: \"kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:48.999445 ip-10-0-132-57 kubenswrapper[2576]: 
I0422 16:08:48.999280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:48.999940 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:48.999920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:49.008290 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:49.008256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlqz\" (UniqueName: \"kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz\") pod \"limitador-limitador-7d549b5b-7m8w9\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:49.107729 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:49.107637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:49.248385 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:49.248352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:08:49.250767 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:08:49.250736 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d1d607_7180_4347_89ce_29f206554ba4.slice/crio-a6a67f9859cb6dbd6d17510b27af9e2a3b0f81a6a841590a58ed2f5be4212753 WatchSource:0}: Error finding container a6a67f9859cb6dbd6d17510b27af9e2a3b0f81a6a841590a58ed2f5be4212753: Status 404 returned error can't find the container with id a6a67f9859cb6dbd6d17510b27af9e2a3b0f81a6a841590a58ed2f5be4212753 Apr 22 16:08:49.789096 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:49.789055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" event={"ID":"68d1d607-7180-4347-89ce-29f206554ba4","Type":"ContainerStarted","Data":"a6a67f9859cb6dbd6d17510b27af9e2a3b0f81a6a841590a58ed2f5be4212753"} Apr 22 16:08:52.804216 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:52.804175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" event={"ID":"68d1d607-7180-4347-89ce-29f206554ba4","Type":"ContainerStarted","Data":"1a0b269ff6982a00d058b031d84016ff17f8217a7fb3580ed965e30c080576b2"} Apr 22 16:08:52.804743 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:52.804346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:08:52.819334 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:08:52.819281 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" podStartSLOduration=2.067194726 podStartE2EDuration="4.819265046s" podCreationTimestamp="2026-04-22 16:08:48 +0000 UTC" firstStartedPulling="2026-04-22 16:08:49.252674484 +0000 UTC m=+606.544863202" lastFinishedPulling="2026-04-22 16:08:52.00474482 +0000 UTC 
m=+609.296933522" observedRunningTime="2026-04-22 16:08:52.817306428 +0000 UTC m=+610.109495153" watchObservedRunningTime="2026-04-22 16:08:52.819265046 +0000 UTC m=+610.111453766" Apr 22 16:09:03.809164 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:03.809090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:09:04.172186 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:04.172090 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:09:04.172355 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:04.172318 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" podUID="68d1d607-7180-4347-89ce-29f206554ba4" containerName="limitador" containerID="cri-o://1a0b269ff6982a00d058b031d84016ff17f8217a7fb3580ed965e30c080576b2" gracePeriod=30 Apr 22 16:09:04.853842 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:04.853755 2576 generic.go:358] "Generic (PLEG): container finished" podID="68d1d607-7180-4347-89ce-29f206554ba4" containerID="1a0b269ff6982a00d058b031d84016ff17f8217a7fb3580ed965e30c080576b2" exitCode=0 Apr 22 16:09:04.853842 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:04.853829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" event={"ID":"68d1d607-7180-4347-89ce-29f206554ba4","Type":"ContainerDied","Data":"1a0b269ff6982a00d058b031d84016ff17f8217a7fb3580ed965e30c080576b2"} Apr 22 16:09:05.123211 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.123187 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:09:05.251449 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.251410 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zlqz\" (UniqueName: \"kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz\") pod \"68d1d607-7180-4347-89ce-29f206554ba4\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " Apr 22 16:09:05.251642 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.251475 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file\") pod \"68d1d607-7180-4347-89ce-29f206554ba4\" (UID: \"68d1d607-7180-4347-89ce-29f206554ba4\") " Apr 22 16:09:05.251859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.251837 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file" (OuterVolumeSpecName: "config-file") pod "68d1d607-7180-4347-89ce-29f206554ba4" (UID: "68d1d607-7180-4347-89ce-29f206554ba4"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:09:05.253781 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.253757 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz" (OuterVolumeSpecName: "kube-api-access-2zlqz") pod "68d1d607-7180-4347-89ce-29f206554ba4" (UID: "68d1d607-7180-4347-89ce-29f206554ba4"). InnerVolumeSpecName "kube-api-access-2zlqz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:09:05.353225 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.353181 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/68d1d607-7180-4347-89ce-29f206554ba4-config-file\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:09:05.353225 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.353218 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zlqz\" (UniqueName: \"kubernetes.io/projected/68d1d607-7180-4347-89ce-29f206554ba4-kube-api-access-2zlqz\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:09:05.729203 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.729133 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-qdc6q"] Apr 22 16:09:05.730023 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.729993 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d1d607-7180-4347-89ce-29f206554ba4" containerName="limitador" Apr 22 16:09:05.730023 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.730022 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d1d607-7180-4347-89ce-29f206554ba4" containerName="limitador" Apr 22 16:09:05.730206 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.730126 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68d1d607-7180-4347-89ce-29f206554ba4" containerName="limitador" Apr 22 16:09:05.739431 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.739395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qdc6q"] Apr 22 16:09:05.739618 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.739520 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.741641 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.741611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 22 16:09:05.742010 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.741661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-76m6n\"" Apr 22 16:09:05.857976 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.857934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e8bd008-6134-4bfb-ac10-4e4903be78ea-data\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.858420 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.857988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnxr\" (UniqueName: \"kubernetes.io/projected/2e8bd008-6134-4bfb-ac10-4e4903be78ea-kube-api-access-6pnxr\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.858949 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.858915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" event={"ID":"68d1d607-7180-4347-89ce-29f206554ba4","Type":"ContainerDied","Data":"a6a67f9859cb6dbd6d17510b27af9e2a3b0f81a6a841590a58ed2f5be4212753"} Apr 22 16:09:05.859002 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.858970 2576 scope.go:117] "RemoveContainer" containerID="1a0b269ff6982a00d058b031d84016ff17f8217a7fb3580ed965e30c080576b2" Apr 22 16:09:05.859038 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.858931 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7m8w9" Apr 22 16:09:05.876138 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.876083 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:09:05.880523 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.880486 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7m8w9"] Apr 22 16:09:05.958706 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.958664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e8bd008-6134-4bfb-ac10-4e4903be78ea-data\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.958706 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.958707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnxr\" (UniqueName: \"kubernetes.io/projected/2e8bd008-6134-4bfb-ac10-4e4903be78ea-kube-api-access-6pnxr\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.959123 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.959101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e8bd008-6134-4bfb-ac10-4e4903be78ea-data\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:05.967017 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:05.966977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pnxr\" (UniqueName: \"kubernetes.io/projected/2e8bd008-6134-4bfb-ac10-4e4903be78ea-kube-api-access-6pnxr\") pod \"postgres-868db5846d-qdc6q\" (UID: \"2e8bd008-6134-4bfb-ac10-4e4903be78ea\") " pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:06.058069 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:06.057966 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:06.192325 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:06.192275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qdc6q"] Apr 22 16:09:06.194829 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:06.194794 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e8bd008_6134_4bfb_ac10_4e4903be78ea.slice/crio-b50127e0e13c84a664456561572d68cdb04f113dc4f8dcdd0e906137f4ded7b4 WatchSource:0}: Error finding container b50127e0e13c84a664456561572d68cdb04f113dc4f8dcdd0e906137f4ded7b4: Status 404 returned error can't find the container with id b50127e0e13c84a664456561572d68cdb04f113dc4f8dcdd0e906137f4ded7b4 Apr 22 16:09:06.865066 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:06.865028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qdc6q" event={"ID":"2e8bd008-6134-4bfb-ac10-4e4903be78ea","Type":"ContainerStarted","Data":"b50127e0e13c84a664456561572d68cdb04f113dc4f8dcdd0e906137f4ded7b4"} Apr 22 16:09:07.396739 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:07.396333 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d1d607-7180-4347-89ce-29f206554ba4" path="/var/lib/kubelet/pods/68d1d607-7180-4347-89ce-29f206554ba4/volumes" Apr 22 16:09:11.888432 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:11.888378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qdc6q" event={"ID":"2e8bd008-6134-4bfb-ac10-4e4903be78ea","Type":"ContainerStarted","Data":"6295ab7ddfcff1a2a663d50b533d8938499df721c3a1068e7ee2d763eba9617c"} Apr 22 16:09:11.888976 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:11.888467 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:11.902800 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:11.902736 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-qdc6q" podStartSLOduration=1.91160457 podStartE2EDuration="6.902717714s" podCreationTimestamp="2026-04-22 16:09:05 +0000 UTC" firstStartedPulling="2026-04-22 16:09:06.196350393 +0000 UTC m=+623.488539113" lastFinishedPulling="2026-04-22 16:09:11.187463551 +0000 UTC m=+628.479652257" observedRunningTime="2026-04-22 16:09:11.901050787 +0000 UTC m=+629.193239509" watchObservedRunningTime="2026-04-22 16:09:11.902717714 +0000 UTC m=+629.194906437" Apr 22 16:09:17.925319 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:17.925289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-qdc6q" Apr 22 16:09:21.560645 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.560605 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:21.567477 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.567447 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:21.569836 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.569807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-bpltj\"" Apr 22 16:09:21.573726 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.573696 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:21.706181 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.706141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wc7\" (UniqueName: \"kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7\") pod \"maas-controller-6d4c8f55f9-2nscw\" (UID: \"149d7097-4880-4bfd-9ff5-6a169fb09e96\") " pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:21.711906 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.711867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67d785bbc6-pdplx"] Apr 22 16:09:21.741622 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.741574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67d785bbc6-pdplx"] Apr 22 16:09:21.741824 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.741738 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:21.807748 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.807704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wc7\" (UniqueName: \"kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7\") pod \"maas-controller-6d4c8f55f9-2nscw\" (UID: \"149d7097-4880-4bfd-9ff5-6a169fb09e96\") " pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:21.815651 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.815583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wc7\" (UniqueName: \"kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7\") pod \"maas-controller-6d4c8f55f9-2nscw\" (UID: \"149d7097-4880-4bfd-9ff5-6a169fb09e96\") " pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:21.825184 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.825148 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67d785bbc6-pdplx"] Apr 22 16:09:21.825489 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:09:21.825469 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-l5lvv], unattached volumes=[], failed to process volumes=[kube-api-access-l5lvv]: context canceled" pod="opendatahub/maas-controller-67d785bbc6-pdplx" podUID="71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a" Apr 22 16:09:21.848941 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.848897 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:21.855973 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.855941 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:21.861774 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.861740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:21.879799 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.879756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:21.909070 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.908987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5lvv\" (UniqueName: \"kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv\") pod \"maas-controller-67d785bbc6-pdplx\" (UID: \"71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a\") " pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:21.931668 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.931636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:21.939079 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:21.939051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:22.011301 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.010723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5lvv\" (UniqueName: \"kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv\") pod \"maas-controller-67d785bbc6-pdplx\" (UID: \"71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a\") " pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:22.011301 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.010783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9ghl\" (UniqueName: \"kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl\") pod \"maas-controller-db8bc5f4-f9tvl\" (UID: \"9b876392-7d97-4214-a521-5dcca7efc2d9\") " pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:22.035732 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.035662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5lvv\" (UniqueName: \"kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv\") pod \"maas-controller-67d785bbc6-pdplx\" (UID: \"71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a\") " pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:22.059802 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.059764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:22.064080 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:22.064043 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149d7097_4880_4bfd_9ff5_6a169fb09e96.slice/crio-d9c9cc2e31a702e7e34071f782fb2cde26651b1554200477534334c1b662e2dd WatchSource:0}: Error finding container d9c9cc2e31a702e7e34071f782fb2cde26651b1554200477534334c1b662e2dd: Status 404 returned error can't find the container with id d9c9cc2e31a702e7e34071f782fb2cde26651b1554200477534334c1b662e2dd Apr 22 16:09:22.119514 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.119466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-l5lvv\" (UniqueName: \"kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv\") pod \"71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a\" (UID: \"71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a\") " Apr 22 16:09:22.120148 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.119775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9ghl\" (UniqueName: \"kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl\") pod \"maas-controller-db8bc5f4-f9tvl\" (UID: \"9b876392-7d97-4214-a521-5dcca7efc2d9\") " pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:22.122705 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.122663 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv" (OuterVolumeSpecName: "kube-api-access-l5lvv") pod "71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a" (UID: "71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a"). InnerVolumeSpecName "kube-api-access-l5lvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:09:22.128246 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.128212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9ghl\" (UniqueName: \"kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl\") pod \"maas-controller-db8bc5f4-f9tvl\" (UID: \"9b876392-7d97-4214-a521-5dcca7efc2d9\") " pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:22.169599 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.169559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:22.221149 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.221115 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5lvv\" (UniqueName: \"kubernetes.io/projected/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a-kube-api-access-l5lvv\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:09:22.305449 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.305421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:22.307935 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:22.307897 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b876392_7d97_4214_a521_5dcca7efc2d9.slice/crio-73b88cb2fcb012389a0c66de5696d96a5ed6e30872ac0841273516024119224d WatchSource:0}: Error finding container 73b88cb2fcb012389a0c66de5696d96a5ed6e30872ac0841273516024119224d: Status 404 returned error can't find the container with id 73b88cb2fcb012389a0c66de5696d96a5ed6e30872ac0841273516024119224d Apr 22 16:09:22.939402 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.939353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" event={"ID":"9b876392-7d97-4214-a521-5dcca7efc2d9","Type":"ContainerStarted","Data":"73b88cb2fcb012389a0c66de5696d96a5ed6e30872ac0841273516024119224d"} Apr 22 16:09:22.940758 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.940725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" event={"ID":"149d7097-4880-4bfd-9ff5-6a169fb09e96","Type":"ContainerStarted","Data":"d9c9cc2e31a702e7e34071f782fb2cde26651b1554200477534334c1b662e2dd"} Apr 22 16:09:22.940904 ip-10-0-132-57 
kubenswrapper[2576]: I0422 16:09:22.940773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d785bbc6-pdplx" Apr 22 16:09:22.974342 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.974282 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67d785bbc6-pdplx"] Apr 22 16:09:22.977796 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:22.977761 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-67d785bbc6-pdplx"] Apr 22 16:09:23.396583 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:23.396515 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a" path="/var/lib/kubelet/pods/71a4f4ff-9d15-44d1-a61a-30ccf4f56c2a/volumes" Apr 22 16:09:25.957884 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.957835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" event={"ID":"149d7097-4880-4bfd-9ff5-6a169fb09e96","Type":"ContainerStarted","Data":"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11"} Apr 22 16:09:25.958387 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.957954 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:25.959827 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.959791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" event={"ID":"9b876392-7d97-4214-a521-5dcca7efc2d9","Type":"ContainerStarted","Data":"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4"} Apr 22 16:09:25.959999 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.959933 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:25.978395 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.978334 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" podStartSLOduration=1.407831466 podStartE2EDuration="4.978316149s" podCreationTimestamp="2026-04-22 16:09:21 +0000 UTC" firstStartedPulling="2026-04-22 16:09:22.065522 +0000 UTC m=+639.357710703" lastFinishedPulling="2026-04-22 16:09:25.636006669 +0000 UTC m=+642.928195386" observedRunningTime="2026-04-22 16:09:25.976157504 +0000 UTC m=+643.268346225" watchObservedRunningTime="2026-04-22 16:09:25.978316149 +0000 UTC m=+643.270504873" Apr 22 16:09:25.994035 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:25.993983 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" podStartSLOduration=1.665881811 podStartE2EDuration="4.993963373s" podCreationTimestamp="2026-04-22 16:09:21 +0000 UTC" firstStartedPulling="2026-04-22 16:09:22.309400959 +0000 UTC m=+639.601589661" lastFinishedPulling="2026-04-22 16:09:25.63748252 +0000 UTC m=+642.929671223" observedRunningTime="2026-04-22 16:09:25.992619635 +0000 UTC m=+643.284808360" watchObservedRunningTime="2026-04-22 16:09:25.993963373 +0000 UTC m=+643.286152143" Apr 22 16:09:27.945975 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.945929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:09:27.948641 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.948610 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:27.950640 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.950607 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 16:09:27.950805 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.950608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 16:09:27.950805 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.950673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-fm6kc\"" Apr 22 16:09:27.958497 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:27.958456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:09:28.076022 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.075977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.076207 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.076116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9rr\" (UniqueName: \"kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.177573 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.177500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9rr\" (UniqueName: \"kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.177764 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.177656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.177816 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:09:28.177793 2576 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 22 16:09:28.177876 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:09:28.177865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls podName:727b1ad3-4621-4684-8d76-6fadcc15b914 nodeName:}" failed. No retries permitted until 2026-04-22 16:09:28.677848504 +0000 UTC m=+645.970037210 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls") pod "maas-api-845b4fbcb9-d2wl4" (UID: "727b1ad3-4621-4684-8d76-6fadcc15b914") : secret "maas-api-serving-cert" not found Apr 22 16:09:28.190098 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.190059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9rr\" (UniqueName: \"kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.682695 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.682643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.685521 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.685479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") pod \"maas-api-845b4fbcb9-d2wl4\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:28.862656 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:28.862594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:29.001407 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:29.001378 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:09:29.005171 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:29.005133 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727b1ad3_4621_4684_8d76_6fadcc15b914.slice/crio-a59570537a32b5f10423fb61dd26870282687dc146b672bf747714e5ba29e800 WatchSource:0}: Error finding container a59570537a32b5f10423fb61dd26870282687dc146b672bf747714e5ba29e800: Status 404 returned error can't find the container with id a59570537a32b5f10423fb61dd26870282687dc146b672bf747714e5ba29e800 Apr 22 16:09:29.978057 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:29.978022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" event={"ID":"727b1ad3-4621-4684-8d76-6fadcc15b914","Type":"ContainerStarted","Data":"a59570537a32b5f10423fb61dd26870282687dc146b672bf747714e5ba29e800"} Apr 22 16:09:30.983479 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:30.983445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" event={"ID":"727b1ad3-4621-4684-8d76-6fadcc15b914","Type":"ContainerStarted","Data":"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885"} Apr 22 16:09:30.983479 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:30.983487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:30.999006 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:30.998945 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" podStartSLOduration=2.697960459 podStartE2EDuration="3.998924485s" 
podCreationTimestamp="2026-04-22 16:09:27 +0000 UTC" firstStartedPulling="2026-04-22 16:09:29.006690652 +0000 UTC m=+646.298879353" lastFinishedPulling="2026-04-22 16:09:30.307654678 +0000 UTC m=+647.599843379" observedRunningTime="2026-04-22 16:09:30.997153688 +0000 UTC m=+648.289342436" watchObservedRunningTime="2026-04-22 16:09:30.998924485 +0000 UTC m=+648.291113208" Apr 22 16:09:36.971272 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:36.971231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:36.971916 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:36.971291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:36.992380 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:36.992349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:09:37.022744 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.022702 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:37.023040 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.022983 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" podUID="149d7097-4880-4bfd-9ff5-6a169fb09e96" containerName="manager" containerID="cri-o://0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11" gracePeriod=10 Apr 22 16:09:37.271714 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.271689 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:37.297772 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.297731 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-654cfbcb98-m54wm"] Apr 22 16:09:37.298555 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.298279 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="149d7097-4880-4bfd-9ff5-6a169fb09e96" containerName="manager" Apr 22 16:09:37.298555 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.298304 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="149d7097-4880-4bfd-9ff5-6a169fb09e96" containerName="manager" Apr 22 16:09:37.298555 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.298423 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="149d7097-4880-4bfd-9ff5-6a169fb09e96" containerName="manager" Apr 22 16:09:37.300838 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.300814 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:37.308936 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.308864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-654cfbcb98-m54wm"] Apr 22 16:09:37.361117 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.361077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxkdx\" (UniqueName: \"kubernetes.io/projected/82599c78-2ffb-4c8e-8b07-7de3b91ef416-kube-api-access-mxkdx\") pod \"maas-controller-654cfbcb98-m54wm\" (UID: \"82599c78-2ffb-4c8e-8b07-7de3b91ef416\") " pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:37.462421 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.462380 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wc7\" (UniqueName: \"kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7\") pod \"149d7097-4880-4bfd-9ff5-6a169fb09e96\" (UID: \"149d7097-4880-4bfd-9ff5-6a169fb09e96\") " Apr 22 16:09:37.462657 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.462495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxkdx\" (UniqueName: \"kubernetes.io/projected/82599c78-2ffb-4c8e-8b07-7de3b91ef416-kube-api-access-mxkdx\") pod \"maas-controller-654cfbcb98-m54wm\" (UID: \"82599c78-2ffb-4c8e-8b07-7de3b91ef416\") " pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:37.464902 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.464873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7" (OuterVolumeSpecName: "kube-api-access-c8wc7") pod "149d7097-4880-4bfd-9ff5-6a169fb09e96" (UID: "149d7097-4880-4bfd-9ff5-6a169fb09e96"). InnerVolumeSpecName "kube-api-access-c8wc7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:09:37.471495 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.471468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxkdx\" (UniqueName: \"kubernetes.io/projected/82599c78-2ffb-4c8e-8b07-7de3b91ef416-kube-api-access-mxkdx\") pod \"maas-controller-654cfbcb98-m54wm\" (UID: \"82599c78-2ffb-4c8e-8b07-7de3b91ef416\") " pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:37.563496 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.563398 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8wc7\" (UniqueName: \"kubernetes.io/projected/149d7097-4880-4bfd-9ff5-6a169fb09e96-kube-api-access-c8wc7\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:09:37.613980 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.613931 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:37.748879 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.748853 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-654cfbcb98-m54wm"] Apr 22 16:09:37.750605 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:37.750567 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82599c78_2ffb_4c8e_8b07_7de3b91ef416.slice/crio-d7c542de7e75f8ecd88314ba07decaec856be2eadcac15787b267cddf2c6be66 WatchSource:0}: Error finding container d7c542de7e75f8ecd88314ba07decaec856be2eadcac15787b267cddf2c6be66: Status 404 returned error can't find the container with id d7c542de7e75f8ecd88314ba07decaec856be2eadcac15787b267cddf2c6be66 Apr 22 16:09:37.751932 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:37.751908 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:09:38.011146 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.011106 2576 generic.go:358] "Generic (PLEG): container finished" podID="149d7097-4880-4bfd-9ff5-6a169fb09e96" containerID="0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11" exitCode=0 Apr 22 16:09:38.011629 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.011183 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" Apr 22 16:09:38.011629 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.011211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" event={"ID":"149d7097-4880-4bfd-9ff5-6a169fb09e96","Type":"ContainerDied","Data":"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11"} Apr 22 16:09:38.011629 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.011260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2nscw" event={"ID":"149d7097-4880-4bfd-9ff5-6a169fb09e96","Type":"ContainerDied","Data":"d9c9cc2e31a702e7e34071f782fb2cde26651b1554200477534334c1b662e2dd"} Apr 22 16:09:38.011629 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.011281 2576 scope.go:117] "RemoveContainer" containerID="0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11" Apr 22 16:09:38.012579 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.012553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-654cfbcb98-m54wm" event={"ID":"82599c78-2ffb-4c8e-8b07-7de3b91ef416","Type":"ContainerStarted","Data":"d7c542de7e75f8ecd88314ba07decaec856be2eadcac15787b267cddf2c6be66"} Apr 22 16:09:38.020930 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.020907 2576 scope.go:117] "RemoveContainer" containerID="0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11" Apr 22 16:09:38.021236 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:09:38.021216 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11\": container with ID starting with 0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11 not found: ID does not exist" containerID="0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11" Apr 22 16:09:38.021302 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.021246 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11"} err="failed to get container status \"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11\": rpc error: code = NotFound desc = could not find container \"0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11\": container with ID starting with 0ab23e04018a08a576daccf5d982fc1ab6bdcc4c30632e68326ae542979e8a11 not found: ID does not exist" Apr 22 16:09:38.033570 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.033462 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:38.035718 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:38.035688 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2nscw"] Apr 22 16:09:39.019546 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:39.019482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-654cfbcb98-m54wm" event={"ID":"82599c78-2ffb-4c8e-8b07-7de3b91ef416","Type":"ContainerStarted","Data":"ab19450dd12ae85d6ad907fdb7082a429d3ad3c0faa6f07a888966449296fbd9"} Apr 22 16:09:39.020015 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:39.019640 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:39.036040 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:39.035977 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-654cfbcb98-m54wm" podStartSLOduration=1.654520412 podStartE2EDuration="2.035961316s" podCreationTimestamp="2026-04-22 16:09:37 +0000 UTC" firstStartedPulling="2026-04-22 16:09:37.752051781 +0000 UTC m=+655.044240483" lastFinishedPulling="2026-04-22 16:09:38.133492683 +0000 UTC m=+655.425681387" observedRunningTime="2026-04-22 16:09:39.033708391 +0000 UTC m=+656.325897115" watchObservedRunningTime="2026-04-22 16:09:39.035961316 +0000 UTC m=+656.328150041" Apr 22 16:09:39.394185 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:39.394099 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149d7097-4880-4bfd-9ff5-6a169fb09e96" path="/var/lib/kubelet/pods/149d7097-4880-4bfd-9ff5-6a169fb09e96/volumes" Apr 22 16:09:50.028545 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.028496 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-654cfbcb98-m54wm" Apr 22 16:09:50.066894 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.066853 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:50.067126 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.067088 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" podUID="9b876392-7d97-4214-a521-5dcca7efc2d9" containerName="manager" containerID="cri-o://efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4" gracePeriod=10 Apr 22 16:09:50.311656 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.311629 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:50.374614 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.374566 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9ghl\" (UniqueName: \"kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl\") pod \"9b876392-7d97-4214-a521-5dcca7efc2d9\" (UID: \"9b876392-7d97-4214-a521-5dcca7efc2d9\") " Apr 22 16:09:50.376889 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.376856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl" (OuterVolumeSpecName: "kube-api-access-t9ghl") pod "9b876392-7d97-4214-a521-5dcca7efc2d9" (UID: "9b876392-7d97-4214-a521-5dcca7efc2d9"). InnerVolumeSpecName "kube-api-access-t9ghl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:09:50.475698 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:50.475658 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9ghl\" (UniqueName: \"kubernetes.io/projected/9b876392-7d97-4214-a521-5dcca7efc2d9-kube-api-access-t9ghl\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:09:51.067393 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.067357 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b876392-7d97-4214-a521-5dcca7efc2d9" containerID="efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4" exitCode=0 Apr 22 16:09:51.067825 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.067425 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" Apr 22 16:09:51.067825 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.067436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" event={"ID":"9b876392-7d97-4214-a521-5dcca7efc2d9","Type":"ContainerDied","Data":"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4"} Apr 22 16:09:51.067825 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.067469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db8bc5f4-f9tvl" event={"ID":"9b876392-7d97-4214-a521-5dcca7efc2d9","Type":"ContainerDied","Data":"73b88cb2fcb012389a0c66de5696d96a5ed6e30872ac0841273516024119224d"} Apr 22 16:09:51.067825 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.067483 2576 scope.go:117] "RemoveContainer" containerID="efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4" Apr 22 16:09:51.076693 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.076672 2576 scope.go:117] "RemoveContainer" containerID="efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4" Apr 22 16:09:51.077073 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:09:51.077053 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4\": container with ID starting with efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4 not found: ID does not exist" containerID="efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4" Apr 22 16:09:51.077152 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.077083 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4"} 
err="failed to get container status \"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4\": rpc error: code = NotFound desc = could not find container \"efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4\": container with ID starting with efb3c627e1739e330f1eae22314a9f54201f6778ae63867aedc7494d64584dd4 not found: ID does not exist" Apr 22 16:09:51.088498 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.088464 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:51.092823 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.092797 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-db8bc5f4-f9tvl"] Apr 22 16:09:51.392965 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:51.392880 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b876392-7d97-4214-a521-5dcca7efc2d9" path="/var/lib/kubelet/pods/9b876392-7d97-4214-a521-5dcca7efc2d9/volumes" Apr 22 16:09:58.572114 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.572075 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-564bc75589-tmk9g"] Apr 22 16:09:58.572587 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.572571 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b876392-7d97-4214-a521-5dcca7efc2d9" containerName="manager" Apr 22 16:09:58.572641 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.572589 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b876392-7d97-4214-a521-5dcca7efc2d9" containerName="manager" Apr 22 16:09:58.572676 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.572660 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b876392-7d97-4214-a521-5dcca7efc2d9" containerName="manager" Apr 22 16:09:58.582822 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.582780 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.587487 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.587429 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-564bc75589-tmk9g"] Apr 22 16:09:58.645508 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.645467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hx2t\" (UniqueName: \"kubernetes.io/projected/6156f868-633a-43aa-a205-4e0ce2b2d6d8-kube-api-access-5hx2t\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.645709 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.645553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/6156f868-633a-43aa-a205-4e0ce2b2d6d8-maas-api-tls\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.746145 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.746105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/6156f868-633a-43aa-a205-4e0ce2b2d6d8-maas-api-tls\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.746340 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.746228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hx2t\" (UniqueName: \"kubernetes.io/projected/6156f868-633a-43aa-a205-4e0ce2b2d6d8-kube-api-access-5hx2t\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.748859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.748825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/6156f868-633a-43aa-a205-4e0ce2b2d6d8-maas-api-tls\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.755289 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.755262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hx2t\" (UniqueName: \"kubernetes.io/projected/6156f868-633a-43aa-a205-4e0ce2b2d6d8-kube-api-access-5hx2t\") pod \"maas-api-564bc75589-tmk9g\" (UID: \"6156f868-633a-43aa-a205-4e0ce2b2d6d8\") " pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:58.897811 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:58.897713 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:09:59.039867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:59.039836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-564bc75589-tmk9g"] Apr 22 16:09:59.042308 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:09:59.042278 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6156f868_633a_43aa_a205_4e0ce2b2d6d8.slice/crio-5cb79b94aff3a0858ae09de1389a59d648ea7df482be16ca1910ef2058bba0c7 WatchSource:0}: Error finding container 5cb79b94aff3a0858ae09de1389a59d648ea7df482be16ca1910ef2058bba0c7: Status 404 returned error can't find the container with id 5cb79b94aff3a0858ae09de1389a59d648ea7df482be16ca1910ef2058bba0c7 Apr 22 16:09:59.100720 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:09:59.100678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-564bc75589-tmk9g" event={"ID":"6156f868-633a-43aa-a205-4e0ce2b2d6d8","Type":"ContainerStarted","Data":"5cb79b94aff3a0858ae09de1389a59d648ea7df482be16ca1910ef2058bba0c7"} Apr 22 16:10:01.109548 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:01.109422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-564bc75589-tmk9g" event={"ID":"6156f868-633a-43aa-a205-4e0ce2b2d6d8","Type":"ContainerStarted","Data":"d99262b50036df7f3b43c38cb6f5985a5f5da228e5946c9e239d489b1d107a47"} Apr 22 16:10:01.109548 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:01.109497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:10:01.125939 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:01.125883 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-564bc75589-tmk9g" podStartSLOduration=1.4400361990000001 podStartE2EDuration="3.125866904s" podCreationTimestamp="2026-04-22 16:09:58 +0000 UTC" firstStartedPulling="2026-04-22 16:09:59.043858371 +0000 UTC m=+676.336047073" lastFinishedPulling="2026-04-22 16:10:00.729689072 +0000 UTC m=+678.021877778" observedRunningTime="2026-04-22 16:10:01.124710756 +0000 UTC m=+678.416899493" watchObservedRunningTime="2026-04-22 16:10:01.125866904 +0000 UTC m=+678.418055628" Apr 22 16:10:07.120746 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.120709 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-564bc75589-tmk9g" Apr 22 16:10:07.163332 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.163299 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:10:07.163593 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.163570 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" podUID="727b1ad3-4621-4684-8d76-6fadcc15b914" containerName="maas-api" containerID="cri-o://eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885" gracePeriod=30 Apr 22 16:10:07.436434 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.436407 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:10:07.527701 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.527661 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr9rr\" (UniqueName: \"kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr\") pod \"727b1ad3-4621-4684-8d76-6fadcc15b914\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " Apr 22 16:10:07.527880 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.527742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") pod \"727b1ad3-4621-4684-8d76-6fadcc15b914\" (UID: \"727b1ad3-4621-4684-8d76-6fadcc15b914\") " Apr 22 16:10:07.530178 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.530139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr" (OuterVolumeSpecName: "kube-api-access-gr9rr") pod "727b1ad3-4621-4684-8d76-6fadcc15b914" (UID: "727b1ad3-4621-4684-8d76-6fadcc15b914"). InnerVolumeSpecName "kube-api-access-gr9rr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:10:07.530178 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.530153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "727b1ad3-4621-4684-8d76-6fadcc15b914" (UID: "727b1ad3-4621-4684-8d76-6fadcc15b914"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:10:07.629316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.629277 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gr9rr\" (UniqueName: \"kubernetes.io/projected/727b1ad3-4621-4684-8d76-6fadcc15b914-kube-api-access-gr9rr\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:10:07.629316 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:07.629312 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/727b1ad3-4621-4684-8d76-6fadcc15b914-maas-api-tls\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 22 16:10:08.139464 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.139428 2576 generic.go:358] "Generic (PLEG): container finished" podID="727b1ad3-4621-4684-8d76-6fadcc15b914" containerID="eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885" exitCode=0 Apr 22 16:10:08.140068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.139474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" event={"ID":"727b1ad3-4621-4684-8d76-6fadcc15b914","Type":"ContainerDied","Data":"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885"} Apr 22 16:10:08.140068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.139491 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" Apr 22 16:10:08.140068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.139522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-845b4fbcb9-d2wl4" event={"ID":"727b1ad3-4621-4684-8d76-6fadcc15b914","Type":"ContainerDied","Data":"a59570537a32b5f10423fb61dd26870282687dc146b672bf747714e5ba29e800"} Apr 22 16:10:08.140068 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.139577 2576 scope.go:117] "RemoveContainer" containerID="eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885" Apr 22 16:10:08.149126 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.149109 2576 scope.go:117] "RemoveContainer" containerID="eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885" Apr 22 16:10:08.149444 ip-10-0-132-57 kubenswrapper[2576]: E0422 16:10:08.149422 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885\": container with ID starting with eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885 not found: ID does not exist" containerID="eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885" Apr 22 16:10:08.149488 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.149456 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885"} err="failed to get container status \"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885\": rpc error: code = NotFound desc = could not find container \"eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885\": container with ID starting with eb05dc6baa306a132c8a2c91f7595a184b0003d8af9c891c3648b6fbc2ccf885 not found: ID does not exist" Apr 22 16:10:08.160017 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.159982 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:10:08.163685 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:08.163653 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-845b4fbcb9-d2wl4"] Apr 22 16:10:09.392949 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:09.392906 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727b1ad3-4621-4684-8d76-6fadcc15b914" path="/var/lib/kubelet/pods/727b1ad3-4621-4684-8d76-6fadcc15b914/volumes" Apr 22 16:10:13.952091 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.952049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp"] Apr 22 16:10:13.952678 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.952657 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="727b1ad3-4621-4684-8d76-6fadcc15b914" containerName="maas-api" Apr 22 16:10:13.952776 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.952681 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="727b1ad3-4621-4684-8d76-6fadcc15b914" containerName="maas-api" Apr 22 16:10:13.952776 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.952767 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="727b1ad3-4621-4684-8d76-6fadcc15b914" containerName="maas-api" Apr 22 16:10:13.957638 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.957610 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:13.959778 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.959730 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 16:10:13.959924 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.959848 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 16:10:13.960677 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.960660 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 16:10:13.960781 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.960675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-k94wr\"" Apr 22 16:10:13.964579 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:13.964552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp"] Apr 22 16:10:14.090320 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.090522 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.090522 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzpq\" (UniqueName: \"kubernetes.io/projected/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kube-api-access-tkzpq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.090522 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.090522 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.090811 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.090590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191419 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkzpq\" (UniqueName: \"kubernetes.io/projected/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kube-api-access-tkzpq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191662 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.191993 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.191961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.192067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.192016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.192067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.192030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.194017 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.193994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.194194 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.194175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.198634 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.198606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkzpq\" (UniqueName: \"kubernetes.io/projected/05b98a06-4417-4bf3-ad6c-e14be54bf8f9-kube-api-access-tkzpq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp\" (UID: \"05b98a06-4417-4bf3-ad6c-e14be54bf8f9\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.269771 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.269676 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:14.417744 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:14.417706 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp"] Apr 22 16:10:14.419555 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:10:14.419503 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b98a06_4417_4bf3_ad6c_e14be54bf8f9.slice/crio-521eda26ef094e5e031623f44435f04f3211e4bcbd25f6e7c60488ad82c3e889 WatchSource:0}: Error finding container 521eda26ef094e5e031623f44435f04f3211e4bcbd25f6e7c60488ad82c3e889: Status 404 returned error can't find the container with id 521eda26ef094e5e031623f44435f04f3211e4bcbd25f6e7c60488ad82c3e889 Apr 22 16:10:15.169282 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:15.169228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" event={"ID":"05b98a06-4417-4bf3-ad6c-e14be54bf8f9","Type":"ContainerStarted","Data":"521eda26ef094e5e031623f44435f04f3211e4bcbd25f6e7c60488ad82c3e889"} Apr 22 16:10:20.195397 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.195356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" event={"ID":"05b98a06-4417-4bf3-ad6c-e14be54bf8f9","Type":"ContainerStarted","Data":"984b68c0400b54f8aeebec199d26a377ac3074b5b52cbaa4d6bcd8f41983e686"} Apr 22 16:10:20.247159 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.247116 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r"] Apr 22 16:10:20.251061 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.251028 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.253163 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.253138 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 16:10:20.259824 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.259792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r"] Apr 22 16:10:20.358335 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.358572 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.358572 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383a13bb-93f8-45e4-8d06-d0257e8b8a94-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.358712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.358712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.358712 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.358646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbkl\" (UniqueName: \"kubernetes.io/projected/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kube-api-access-6tbkl\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460298 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460188 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460473 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383a13bb-93f8-45e4-8d06-d0257e8b8a94-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460473 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460473 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460473 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbkl\" (UniqueName: \"kubernetes.io/projected/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kube-api-access-6tbkl\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460696 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460696 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460867 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.460959 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.460936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.463161 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.463137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383a13bb-93f8-45e4-8d06-d0257e8b8a94-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.463276 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.463259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383a13bb-93f8-45e4-8d06-d0257e8b8a94-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.467858 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.467826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbkl\" (UniqueName: \"kubernetes.io/projected/383a13bb-93f8-45e4-8d06-d0257e8b8a94-kube-api-access-6tbkl\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r\" (UID: \"383a13bb-93f8-45e4-8d06-d0257e8b8a94\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.564943 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.564909 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:20.719571 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:20.719449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r"] Apr 22 16:10:20.724331 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:10:20.724291 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383a13bb_93f8_45e4_8d06_d0257e8b8a94.slice/crio-ee55e517824ceafff2a048425f94a82e4e79be00b15bc46691a134ca24be4516 WatchSource:0}: Error finding container ee55e517824ceafff2a048425f94a82e4e79be00b15bc46691a134ca24be4516: Status 404 returned error can't find the container with id ee55e517824ceafff2a048425f94a82e4e79be00b15bc46691a134ca24be4516 Apr 22 16:10:21.202062 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:21.202020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" event={"ID":"383a13bb-93f8-45e4-8d06-d0257e8b8a94","Type":"ContainerStarted","Data":"e9eb29615fff664fae80548d228a3b95a949f1e940d09eaae284668cd3d6029c"} Apr 22 16:10:21.202062 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:21.202067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" event={"ID":"383a13bb-93f8-45e4-8d06-d0257e8b8a94","Type":"ContainerStarted","Data":"ee55e517824ceafff2a048425f94a82e4e79be00b15bc46691a134ca24be4516"} Apr 22 16:10:26.224664 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:26.224573 2576 generic.go:358] "Generic (PLEG): container finished" podID="05b98a06-4417-4bf3-ad6c-e14be54bf8f9" containerID="984b68c0400b54f8aeebec199d26a377ac3074b5b52cbaa4d6bcd8f41983e686" exitCode=0 Apr 22 16:10:26.224664 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:26.224648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" event={"ID":"05b98a06-4417-4bf3-ad6c-e14be54bf8f9","Type":"ContainerDied","Data":"984b68c0400b54f8aeebec199d26a377ac3074b5b52cbaa4d6bcd8f41983e686"} Apr 22 16:10:27.231233 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:27.231147 2576 generic.go:358] "Generic (PLEG): container finished" podID="383a13bb-93f8-45e4-8d06-d0257e8b8a94" containerID="e9eb29615fff664fae80548d228a3b95a949f1e940d09eaae284668cd3d6029c" exitCode=0 Apr 22 16:10:27.231233 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:27.231225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" event={"ID":"383a13bb-93f8-45e4-8d06-d0257e8b8a94","Type":"ContainerDied","Data":"e9eb29615fff664fae80548d228a3b95a949f1e940d09eaae284668cd3d6029c"} Apr 22 16:10:28.236943 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.236908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" event={"ID":"383a13bb-93f8-45e4-8d06-d0257e8b8a94","Type":"ContainerStarted","Data":"da82a2b58bb390643eb2f38cbf2f6103e51833f7e1896ba5a3f6afa7653d41b9"} Apr 22 16:10:28.237385 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.237163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:10:28.238759 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.238733 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" event={"ID":"05b98a06-4417-4bf3-ad6c-e14be54bf8f9","Type":"ContainerStarted","Data":"089179e901e9732b7dea1847a20897758c58686812bd93708fe1272606d8c901"} Apr 22 16:10:28.238944 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.238928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:28.254012 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.253958 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" podStartSLOduration=7.908520193 podStartE2EDuration="8.25394124s" podCreationTimestamp="2026-04-22 16:10:20 +0000 UTC" firstStartedPulling="2026-04-22 16:10:27.231987654 +0000 UTC m=+704.524176357" lastFinishedPulling="2026-04-22 16:10:27.577408703 +0000 UTC m=+704.869597404" observedRunningTime="2026-04-22 16:10:28.252266805 +0000 UTC m=+705.544455528" watchObservedRunningTime="2026-04-22 16:10:28.25394124 +0000 UTC m=+705.546129963" Apr 22 16:10:28.269901 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:28.269835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" podStartSLOduration=2.115472144 podStartE2EDuration="15.269819859s" podCreationTimestamp="2026-04-22 16:10:13 +0000 UTC" firstStartedPulling="2026-04-22 16:10:14.422278228 +0000 UTC m=+691.714466930" lastFinishedPulling="2026-04-22 16:10:27.576625941 +0000 UTC m=+704.868814645" observedRunningTime="2026-04-22 16:10:28.266831399 +0000 UTC m=+705.559020137" watchObservedRunningTime="2026-04-22 16:10:28.269819859 +0000 UTC m=+705.562008644" Apr 22 16:10:39.256156 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:39.256081 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp" Apr 22 16:10:39.256671 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:10:39.256649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r" Apr 22 16:13:36.158664 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.158565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-tvp8f_4f6f0783-eae9-4ae1-92ed-e4430af515bf/manager/0.log" Apr 22 16:13:36.268327 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.268275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-564bc75589-tmk9g_6156f868-633a-43aa-a205-4e0ce2b2d6d8/maas-api/0.log" Apr 22 16:13:36.379385 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.379352 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-654cfbcb98-m54wm_82599c78-2ffb-4c8e-8b07-7de3b91ef416/manager/0.log" Apr 22 16:13:36.489995 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.489963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-5krlx_a5f2b79a-f0e6-447c-b3df-87cc295c9033/manager/0.log" Apr 22 16:13:36.603810 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.603777 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-5v6q8_0edae815-51c6-4665-9eb8-b2908ac3053e/manager/0.log" Apr 22 16:13:36.940960 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:36.940917 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qdc6q_2e8bd008-6134-4bfb-ac10-4e4903be78ea/postgres/0.log" Apr 22 16:13:38.891713 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:38.891677 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-t64x5_f538ea4a-710e-4b41-bc9c-ccb286ac8d8d/manager/0.log" Apr 22 16:13:39.319291 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:39.319259 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-pgtbn_184a27b6-8d63-420b-ab37-a48184d2303c/discovery/0.log" Apr 22 16:13:39.423151 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:39.423121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c4b9b554-ldk8l_75257664-1fa7-401a-9046-9a756b9e9335/kube-auth-proxy/0.log" Apr 22 16:13:39.732034 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:39.731998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dd4c648bd-28stq_a0f34ab0-9da5-40dc-8537-84795ca7da5f/router/0.log" Apr 22 16:13:40.070063 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:40.069975 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp_05b98a06-4417-4bf3-ad6c-e14be54bf8f9/main/0.log" Apr 22 16:13:40.076729 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:40.076698 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-zfntp_05b98a06-4417-4bf3-ad6c-e14be54bf8f9/storage-initializer/0.log" Apr 22 16:13:40.412711 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:40.412620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r_383a13bb-93f8-45e4-8d06-d0257e8b8a94/storage-initializer/0.log" Apr 22 16:13:40.418859 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:40.418814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc8kh7r_383a13bb-93f8-45e4-8d06-d0257e8b8a94/main/0.log" Apr 22 16:13:43.323470 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:43.323439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 16:13:43.327276 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:43.327247 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log" Apr 22 16:13:47.229324 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:47.229289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tv4kw_b41e7195-65f0-4477-b983-c3847499b874/global-pull-secret-syncer/0.log" Apr 22 16:13:47.274874 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:47.274840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4qb9z_2f956ae0-432f-4703-aab0-141a0a0a573c/konnectivity-agent/0.log" Apr 22 16:13:47.349277 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:47.349248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-57.ec2.internal_ad918dee123885abb804caebda37d740/haproxy/0.log" Apr 22 16:13:52.153137 ip-10-0-132-57 kubenswrapper[2576]: I0422 
16:13:52.153081 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-t64x5_f538ea4a-710e-4b41-bc9c-ccb286ac8d8d/manager/0.log" Apr 22 16:13:53.537331 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.537299 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/alertmanager/0.log" Apr 22 16:13:53.577664 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.577637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/config-reloader/0.log" Apr 22 16:13:53.599909 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.599863 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/kube-rbac-proxy-web/0.log" Apr 22 16:13:53.620654 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.620626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/kube-rbac-proxy/0.log" Apr 22 16:13:53.645063 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.645040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/kube-rbac-proxy-metric/0.log" Apr 22 16:13:53.668219 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.668192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/prom-label-proxy/0.log" Apr 22 16:13:53.690559 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.690508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b3bf771-ec61-4f1b-b902-111ac6068823/init-config-reloader/0.log" Apr 22 16:13:53.727227 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.727187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-grltq_770ecfe6-c426-49c5-aa5c-da656e2dc3c3/cluster-monitoring-operator/0.log" Apr 22 16:13:53.751735 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.751702 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcqz5_490859ab-6bc0-4c2f-ad64-89b70d258a54/kube-state-metrics/0.log" Apr 22 16:13:53.780690 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.780657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcqz5_490859ab-6bc0-4c2f-ad64-89b70d258a54/kube-rbac-proxy-main/0.log" Apr 22 16:13:53.805917 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.805832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcqz5_490859ab-6bc0-4c2f-ad64-89b70d258a54/kube-rbac-proxy-self/0.log" Apr 22 16:13:53.881592 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.881563 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-5pvh8_70f25860-e176-408f-8756-ca502709bcc8/monitoring-plugin/0.log" Apr 22 16:13:53.996622 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:53.996589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tmcqz_f20cdfa6-60b8-47c1-8cb5-a10b7f43e303/node-exporter/0.log" Apr 22 16:13:54.017579 
ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:54.017551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tmcqz_f20cdfa6-60b8-47c1-8cb5-a10b7f43e303/kube-rbac-proxy/0.log"
Apr 22 16:13:54.041815 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:54.041785 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tmcqz_f20cdfa6-60b8-47c1-8cb5-a10b7f43e303/init-textfile/0.log"
Apr 22 16:13:56.008056 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.008016 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"]
Apr 22 16:13:56.011831 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.011807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.014159 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.014134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"kube-root-ca.crt\""
Apr 22 16:13:56.014986 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.014959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b5p7f\"/\"default-dockercfg-bfrf8\""
Apr 22 16:13:56.015169 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.014958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"openshift-service-ca.crt\""
Apr 22 16:13:56.018389 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.018356 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"]
Apr 22 16:13:56.118545 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.118498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-podres\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.118747 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.118655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-sys\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.118747 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.118688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-lib-modules\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.118747 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.118713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-proc\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.118869 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.118810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4zl\" (UniqueName: \"kubernetes.io/projected/27527133-f4d7-435b-b750-ea16b0e70973-kube-api-access-qz4zl\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219388 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4zl\" (UniqueName: \"kubernetes.io/projected/27527133-f4d7-435b-b750-ea16b0e70973-kube-api-access-qz4zl\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-podres\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-sys\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-lib-modules\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219612 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-proc\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219769 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-sys\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.219769 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.219642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-podres\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.220030 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.220009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-lib-modules\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.220339 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.220298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27527133-f4d7-435b-b750-ea16b0e70973-proc\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.227791 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.227754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4zl\" (UniqueName: \"kubernetes.io/projected/27527133-f4d7-435b-b750-ea16b0e70973-kube-api-access-qz4zl\") pod \"perf-node-gather-daemonset-lg7nf\" (UID: \"27527133-f4d7-435b-b750-ea16b0e70973\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.324555 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.324423 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:56.466067 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.466032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"]
Apr 22 16:13:56.468704 ip-10-0-132-57 kubenswrapper[2576]: W0422 16:13:56.468670 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod27527133_f4d7_435b_b750_ea16b0e70973.slice/crio-2e85ebed18ef0d62a5eae1e24f471c7c1b4bd90c827d1aeee32383d4f5611a3a WatchSource:0}: Error finding container 2e85ebed18ef0d62a5eae1e24f471c7c1b4bd90c827d1aeee32383d4f5611a3a: Status 404 returned error can't find the container with id 2e85ebed18ef0d62a5eae1e24f471c7c1b4bd90c827d1aeee32383d4f5611a3a
Apr 22 16:13:56.793366 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.793325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fcf674f7-rsdpn_4f2e4215-d989-4582-a4fd-50e024290fda/console/0.log"
Apr 22 16:13:56.823198 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:56.823157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wnt9h_4614c8ab-95b2-46ee-b4a6-3b5e1953986f/download-server/0.log"
Apr 22 16:13:57.081745 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:57.081709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf" event={"ID":"27527133-f4d7-435b-b750-ea16b0e70973","Type":"ContainerStarted","Data":"9573aff39d0762f975fef0649f3615e446467f9d6f035754b634bafe2c85b23e"}
Apr 22 16:13:57.081745 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:57.081748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf" event={"ID":"27527133-f4d7-435b-b750-ea16b0e70973","Type":"ContainerStarted","Data":"2e85ebed18ef0d62a5eae1e24f471c7c1b4bd90c827d1aeee32383d4f5611a3a"}
Apr 22 16:13:57.082180 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:57.081784 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:13:57.095933 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:57.095877 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf" podStartSLOduration=2.095859413 podStartE2EDuration="2.095859413s" podCreationTimestamp="2026-04-22 16:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:13:57.093518891 +0000 UTC m=+914.385707614" watchObservedRunningTime="2026-04-22 16:13:57.095859413 +0000 UTC m=+914.388048136"
Apr 22 16:13:58.222566 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:58.222539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-njwkw_c54ba0ce-4b35-4707-8ee7-608cb358834b/dns/0.log"
Apr 22 16:13:58.244895 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:58.244863 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-njwkw_c54ba0ce-4b35-4707-8ee7-608cb358834b/kube-rbac-proxy/0.log"
Apr 22 16:13:58.316774 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:58.316748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nq959_0f691780-d61d-4734-a36c-01d15ac43908/dns-node-resolver/0.log"
Apr 22 16:13:58.789823 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:58.789793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8956b97cd-qm8h9_c4c85ad4-4658-4c87-9205-179acf53b17d/registry/0.log"
Apr 22 16:13:58.812216 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:58.812173 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4gt4d_284f7a34-e743-4f00-9226-bfcbfbabe4a4/node-ca/0.log"
Apr 22 16:13:59.744900 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:59.744849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-pgtbn_184a27b6-8d63-420b-ab37-a48184d2303c/discovery/0.log"
Apr 22 16:13:59.763992 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:59.763959 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c4b9b554-ldk8l_75257664-1fa7-401a-9046-9a756b9e9335/kube-auth-proxy/0.log"
Apr 22 16:13:59.843636 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:13:59.843601 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dd4c648bd-28stq_a0f34ab0-9da5-40dc-8537-84795ca7da5f/router/0.log"
Apr 22 16:14:00.398445 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:00.398406 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-57g78_bb23ac61-dcc3-40eb-a485-4e58d6ea6d04/serve-healthcheck-canary/0.log"
Apr 22 16:14:01.013866 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:01.013830 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6v78_27117f2e-c258-4f10-8c53-d2baf49a8e53/kube-rbac-proxy/0.log"
Apr 22 16:14:01.034904 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:01.034875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6v78_27117f2e-c258-4f10-8c53-d2baf49a8e53/exporter/0.log"
Apr 22 16:14:01.055990 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:01.055964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6v78_27117f2e-c258-4f10-8c53-d2baf49a8e53/extractor/0.log"
Apr 22 16:14:02.890113 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:02.890076 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-tvp8f_4f6f0783-eae9-4ae1-92ed-e4430af515bf/manager/0.log"
Apr 22 16:14:02.915722 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:02.915690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-564bc75589-tmk9g_6156f868-633a-43aa-a205-4e0ce2b2d6d8/maas-api/0.log"
Apr 22 16:14:02.943347 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:02.943318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-654cfbcb98-m54wm_82599c78-2ffb-4c8e-8b07-7de3b91ef416/manager/0.log"
Apr 22 16:14:02.977223 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:02.977179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-5krlx_a5f2b79a-f0e6-447c-b3df-87cc295c9033/manager/0.log"
Apr 22 16:14:02.999894 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:02.999865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-5v6q8_0edae815-51c6-4665-9eb8-b2908ac3053e/manager/0.log"
Apr 22 16:14:03.097444 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:03.097406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-lg7nf"
Apr 22 16:14:03.101225 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:03.101199 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qdc6q_2e8bd008-6134-4bfb-ac10-4e4903be78ea/postgres/0.log"
Apr 22 16:14:04.226661 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:04.226626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wgxq2_d5bb611c-c5a3-46c4-83d0-19da2d69dc62/openshift-lws-operator/0.log"
Apr 22 16:14:10.372850 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.372816 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:14:10.396037 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.396009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/egress-router-binary-copy/0.log"
Apr 22 16:14:10.417480 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.417454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/cni-plugins/0.log"
Apr 22 16:14:10.439831 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.439804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/bond-cni-plugin/0.log"
Apr 22 16:14:10.461387 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.461361 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/routeoverride-cni/0.log"
Apr 22 16:14:10.482225 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.482204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/whereabouts-cni-bincopy/0.log"
Apr 22 16:14:10.504044 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.504014 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvscs_ea3eaabc-bec9-4b13-b4f1-f400b42ea71a/whereabouts-cni/0.log"
Apr 22 16:14:10.571591 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.571522 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nh7h2_91027f2c-ef91-41c1-a5c4-9c43eba2e5e5/kube-multus/0.log"
Apr 22 16:14:10.634979 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.634875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5v2vn_13a488e0-8f15-4fd1-8913-c002ea52d186/network-metrics-daemon/0.log"
Apr 22 16:14:10.655339 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:10.655284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5v2vn_13a488e0-8f15-4fd1-8913-c002ea52d186/kube-rbac-proxy/0.log"
Apr 22 16:14:11.817949 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.817910 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-controller/0.log"
Apr 22 16:14:11.838271 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.838229 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/0.log"
Apr 22 16:14:11.842777 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.842745 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovn-acl-logging/1.log"
Apr 22 16:14:11.865746 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.865718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/kube-rbac-proxy-node/0.log"
Apr 22 16:14:11.887760 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.887733 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:14:11.907607 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.907574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/northd/0.log"
Apr 22 16:14:11.928710 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.928683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/nbdb/0.log"
Apr 22 16:14:11.951805 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:11.951771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/sbdb/0.log"
Apr 22 16:14:12.065197 ip-10-0-132-57 kubenswrapper[2576]: I0422 16:14:12.065151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxnsg_d3e14dff-8806-4eb4-92e8-68169209c285/ovnkube-controller/0.log"