Apr 16 18:01:59.587734 ip-10-0-138-15 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:00.046135 ip-10-0-138-15 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:00.046135 ip-10-0-138-15 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:00.046135 ip-10-0-138-15 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:00.046135 ip-10-0-138-15 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:00.046135 ip-10-0-138-15 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:00.047741 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.047656 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:00.051000 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.050985 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:00.051000 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051001 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051004 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051007 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051010 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051013 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051016 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051020 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051024 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051028 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051030 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051033 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051036 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051039 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051046 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051050 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051052 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051055 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051058 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051060 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051063 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:00.051059 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051066 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051069 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051072 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051075 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051078 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051080 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051083 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051085 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051088 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051091 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051094 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051096 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051099 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051101 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051104 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051108 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051111 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051113 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051116 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051118 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:00.051549 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051121 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051123 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051125 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051129 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051131 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051134 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051137 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051140 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051142 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051145 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051147 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051150 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051152 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051155 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051158 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051161 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051165 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051168 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051172 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:00.052028 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051176 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051179 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051182 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051184 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051187 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051190 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051193 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051196 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051199 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051201 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051204 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051207 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051209 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051212 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051214 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051217 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051219 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051222 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051224 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:00.052513 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051227 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051230 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051233 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051235 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051238 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051240 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051244 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051639 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051645 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051649 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051652 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051655 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051658 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051661 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051663 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051666 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051669 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051671 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051674 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:00.052963 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051677 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051680 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051683 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051685 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051688 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051691 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051693 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051696 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051698 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051701 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051704 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051706 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051709 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051711 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051714 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051717 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051722 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051725 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051728 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:00.053464 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051731 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051734 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051740 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051743 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051747 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051749 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051752 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051754 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051757 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051759 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051762 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051765 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051767 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051770 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051772 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051775 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051778 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051780 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051783 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051786 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:00.053961 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051788 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051791 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051794 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051796 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051799 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051802 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051804 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051807 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051810 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051813 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051816 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051818 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051821 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051823 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051827 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051830 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051833 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051835 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051838 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051840 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:00.054465 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051843 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051846 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051848 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051851 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051853 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051856 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051858 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051861 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051863 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051866 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051868 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051871 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051873 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051876 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.051878 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052532 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052541 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052548 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052558 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052562 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052566 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:00.054947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052571 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052576 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052580 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052583 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052587 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052591 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052594 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052598 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052601 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052604 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052607 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052610 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052612 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052617 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052620 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052622 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052625 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052629 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052633 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052636 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052639 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052642 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052646 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052649 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:00.055507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052652 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052656 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052659 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052664 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052667 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052670 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052673 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052676 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052679 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052684 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052688 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052691 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052694 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052697 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052701 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052704 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052707 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052710 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052713 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052716 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052719 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052722 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052725 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052728 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052731 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:02:00.056084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052735 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052738 2574 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052742 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052745 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052748 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052751 2574 flags.go:64] FLAG: --help="false" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052754 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-15.ec2.internal" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052757 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052761 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052764 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052767 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052771 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052774 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052777 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052780 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:02:00.056696 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:02:00.052783 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052786 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052790 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052793 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052796 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052798 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052801 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052804 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052807 2574 flags.go:64] FLAG: --lock-file="" Apr 16 18:02:00.056696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052810 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052813 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052817 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052823 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052826 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052829 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 
18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052832 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052835 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052839 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052842 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052844 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052849 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052852 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052856 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052863 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052866 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052869 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052872 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052875 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052878 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052881 2574 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052889 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052892 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052895 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:02:00.057285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052898 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052901 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052911 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052914 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052917 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052920 2574 flags.go:64] FLAG: --port="10250" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052923 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052926 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d75ecbd0915cfe65" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052930 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052933 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:02:00.052937 2574 flags.go:64] FLAG: --register-node="true" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052940 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052943 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052947 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052950 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052953 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052955 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052959 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052962 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052965 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052969 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052972 2574 flags.go:64] FLAG: --runonce="false" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052974 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052979 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052982 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:02:00.057935 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:02:00.052985 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052988 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052991 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052994 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.052997 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053000 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053003 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053006 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053009 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053012 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053015 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053018 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053024 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053026 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053030 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053034 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053037 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053041 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053044 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053047 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053050 2574 flags.go:64] FLAG: --v="2" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053055 2574 flags.go:64] FLAG: --version="false" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053059 2574 flags.go:64] FLAG: --vmodule="" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053063 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.053066 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:02:00.058548 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053160 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053164 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053168 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053171 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:00.059136 ip-10-0-138-15 
kubenswrapper[2574]: W0416 18:02:00.053174 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053178 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053181 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053185 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053187 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053190 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053192 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053195 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053198 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053200 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053203 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053206 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053209 2574 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053211 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053214 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:00.059136 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053217 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053221 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053224 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053227 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053230 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053234 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053237 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053239 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053242 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053244 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 
18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053247 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053250 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053252 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053255 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053257 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053260 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053262 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053265 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053269 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:00.059920 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053272 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053274 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053277 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053280 2574 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053282 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053285 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053287 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053290 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053292 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053295 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053298 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053300 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053302 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053305 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053319 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053323 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053325 
2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053328 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053332 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053334 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:00.060584 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053337 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053339 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053342 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053346 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053349 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053352 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053355 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053357 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053360 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053362 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053378 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053383 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053385 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053388 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053390 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053393 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 
18:02:00.053396 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053399 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053401 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:00.061083 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053404 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053407 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053409 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053412 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053414 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053417 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053420 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053423 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.053426 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.054117 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true 
KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:00.061598 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.061590 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.061605 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061653 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061658 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061662 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061664 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061668 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061671 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061674 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061676 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:00.061854 ip-10-0-138-15 
kubenswrapper[2574]: W0416 18:02:00.061679 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061682 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061685 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061688 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061690 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061693 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061696 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061698 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061701 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:00.061854 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061704 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061706 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061709 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061711 2574
feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061714 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061717 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061720 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061723 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061726 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061728 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061731 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061734 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061736 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061739 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061742 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061746 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:00.062323 ip-10-0-138-15
kubenswrapper[2574]: W0416 18:02:00.061749 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061751 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061754 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061757 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:00.062323 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061759 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061762 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061765 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061767 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061771 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061774 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061776 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061779 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061781 2574 feature_gate.go:328] unrecognized feature gate:
MetricsCollectionProfiles
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061784 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061786 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061789 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061791 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061794 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061797 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061800 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061802 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061805 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061809 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061814 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:00.062830 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061817 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061820 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061823 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061825 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061828 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061830 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061833 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061836 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061839 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061842 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061844 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061849 2574 feature_gate.go:349] Setting
deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061852 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061855 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061858 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061860 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061863 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061866 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061869 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061872 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:00.063307 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061875 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061877 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061880 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061882 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416
18:02:00.061885 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061888 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061890 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061893 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061895 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.061900 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061993 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.061998 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062001 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062004 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416
18:02:00.062007 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062010 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:00.063817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062012 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062015 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062018 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062021 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062024 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062027 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062030 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062033 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062035 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062038 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062041 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]:
W0416 18:02:00.062043 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062046 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062048 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062051 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062054 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062057 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062060 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062062 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062065 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:00.064209 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062067 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062070 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062072 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062075 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:00.064722
ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062077 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062081 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062085 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062088 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062091 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062093 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062096 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062099 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062102 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062104 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062107 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062110 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062113 2574
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062116 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062119 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:00.064722 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062121 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062124 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062127 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062129 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062132 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062135 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062138 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062140 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062143 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062146 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16
18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062149 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062153 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062156 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062158 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062161 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062163 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062166 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062168 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062171 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:00.065204 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062173 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062176 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062178 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416
18:02:00.062181 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062183 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062186 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062188 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062191 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062194 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062196 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062199 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062202 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062205 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062207 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062210 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062213 2574 feature_gate.go:328] unrecognized feature gate:
NoRegistryClusterOperations
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062215 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062218 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062220 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:00.065686 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062224 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:00.066162 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062226 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:00.066162 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:00.062228 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:00.066162 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.062233 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:00.066162 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.062917 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:02:00.066671 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.066657 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:02:00.067553 ip-10-0-138-15 kubenswrapper[2574]: I0416
18:02:00.067542 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 18:02:00.067657 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.067640 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:00.067690 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.067682 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:00.090834 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.090819 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:00.093213 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.093178 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:00.108840 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.108813 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:02:00.116426 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.116407 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 18:02:00.117661 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.117647 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:02:00.121448 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.121428 2574 fs.go:135] Filesystem UUIDs: map[4e23bc04-a03b-4f60-a692-3089868f79ac:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8872dada-1be8-41ac-ae48-b5b38e53a050:/dev/nvme0n1p4]
Apr 16 18:02:00.121521 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.121447 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp
major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:02:00.127256 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.127239 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:00.127578 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.127475 2574 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:00.125725543 +0000 UTC m=+0.403139427 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100016 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec276a255cd9e202468e2f7cb97a3f01 SystemUUID:ec276a25-5cd9-e202-468e-2f7cb97a3f01 BootID:0311f180-3b0d-4115-9c1c-117308948c6c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:a2:dc:32:a3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:11:a2:dc:32:a3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:be:eb:73:65:4b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:02:00.127774 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.127765 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:02:00.127863 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.127852 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:02:00.128913 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.128892 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:02:00.129044 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.128916 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-15.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:02:00.129087 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.129053 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:02:00.129087 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.129062 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:02:00.129087 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.129075 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:00.129173 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.129092 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:00.130638 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.130625 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:02:00.130740 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.130731 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:02:00.133221 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.133211 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:02:00.133257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.133225 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:02:00.133257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.133237 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:02:00.133257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.133246 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:02:00.133406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.133263 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:02:00.134235 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.134223 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:00.134270 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.134243 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:00.137301 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.137284 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:02:00.138938 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.138921 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:02:00.140110 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140095 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:02:00.140179 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140119 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:02:00.140179 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140130 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:02:00.140179 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140142 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:02:00.140179 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140154 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140196 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140212 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140224 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140239 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140253 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140287 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:02:00.140359 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.140306 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:02:00.141355 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.141342 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:02:00.141424 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.141383 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:02:00.142139 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.142112 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:02:00.142513 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.142488 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:02:00.144818 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.144800 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-15.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:02:00.144927 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.144914 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:02:00.144982 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.144955 2574 server.go:1295] "Started kubelet"
Apr 16 18:02:00.145105 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.145047 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:02:00.145186 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.145122 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:02:00.145229 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.145213 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:02:00.145967 ip-10-0-138-15 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:02:00.146364 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.146299 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:02:00.147351 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.147338 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:02:00.150592 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.150569 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s9prg"
Apr 16 18:02:00.153362 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.153346 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:02:00.153362 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.153358 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:02:00.154084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154067 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:02:00.154084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154068 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:02:00.154224 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154093 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:02:00.154224 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154188 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:02:00.154224 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.154187 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.154224 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154195 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:02:00.154530 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154511 2574 factory.go:153] Registering CRI-O factory
Apr 16 18:02:00.154530 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154532 2574 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:02:00.154643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154590 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:02:00.154643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154601 2574 factory.go:55] Registering systemd factory
Apr 16 18:02:00.154643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154609 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:02:00.154643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154634 2574 factory.go:103] Registering Raw factory
Apr 16 18:02:00.154755 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154649 2574 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:02:00.154973 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.154962 2574 manager.go:319] Starting recovery of all containers
Apr 16 18:02:00.155274 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.154192 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-15.ec2.internal.18a6e84ced8e59c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-15.ec2.internal,UID:ip-10-0-138-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-15.ec2.internal,},FirstTimestamp:2026-04-16 18:02:00.14492717 +0000 UTC m=+0.422341064,LastTimestamp:2026-04-16 18:02:00.14492717 +0000 UTC m=+0.422341064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-15.ec2.internal,}"
Apr 16 18:02:00.155432 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.155413 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:02:00.158109 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.158087 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s9prg"
Apr 16 18:02:00.164136 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.163938 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:02:00.164663 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.164634 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 18:02:00.165788 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.165775 2574 manager.go:324] Recovery completed
Apr 16 18:02:00.170290 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.170275 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.172860 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.172846 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.172938 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.172871 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.172938 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.172895 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.173402 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.173385 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:02:00.173402 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.173401 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:02:00.173516 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.173418 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:02:00.175530 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.175516 2574 policy_none.go:49] "None policy: Start"
Apr 16 18:02:00.175530 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.175532 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:02:00.175623 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.175542 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:02:00.208934 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.208910 2574 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.208947 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.208967 2574 server.go:85] "Starting device plugin registration server"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.209210 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.209221 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.209307 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.209434 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.209443 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.209852 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:02:00.223654 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.209893 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.289094 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.289072 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:02:00.290316 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.290294 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:02:00.290411 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.290337 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:02:00.290411 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.290357 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:02:00.290411 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.290382 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:02:00.290558 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.290422 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:02:00.294651 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.294635 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:00.309989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.309955 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.310683 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.310663 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.310760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.310692 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.310760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.310704 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.310760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.310722 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.319753 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.319734 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.319838 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.319756 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-15.ec2.internal\": node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.341587 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.341568 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.391317 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.391295 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"]
Apr 16 18:02:00.391420 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.391353 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.392805 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.392785 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.392887 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.392815 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.392887 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.392825 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.393908 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.393895 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.394073 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394058 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.394125 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394092 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.394955 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394939 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.395039 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394968 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.395039 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394942 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.395039 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.395004 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.395039 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.395016 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.395039 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.394982 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.396641 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.396624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.396719 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.396652 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:00.397445 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.397422 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:00.397515 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.397456 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:00.397515 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.397468 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:00.427944 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.427924 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-15.ec2.internal\" not found" node="ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.432021 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.432005 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-15.ec2.internal\" not found" node="ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.442032 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.442014 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.455616 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.455598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.455696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.455627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.455696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.455654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ede70444dfe8774ee4b88dac28542853-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"ede70444dfe8774ee4b88dac28542853\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.542499 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.542482 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.555933 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.555906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.556002 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.555942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.556002 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.555964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ede70444dfe8774ee4b88dac28542853-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"ede70444dfe8774ee4b88dac28542853\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.556098 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.556016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.556098 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.556038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96d5b24163c5b9fd60b620b912892f8b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"96d5b24163c5b9fd60b620b912892f8b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.556098 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.556032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ede70444dfe8774ee4b88dac28542853-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"ede70444dfe8774ee4b88dac28542853\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.643301 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.643258 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.730691 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.730668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.734015 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:00.734000 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 16 18:02:00.743858 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.743839 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.844409 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.844385 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:00.944878 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:00.944829 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.045521 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:01.045496 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.066980 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.066953 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:02:01.067403 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.067086 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:02:01.145615 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:01.145592 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.153640 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.153622 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:02:01.163620 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.163591 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:00 +0000 UTC" deadline="2027-09-10 06:15:27.050057536 +0000 UTC"
Apr 16 18:02:01.163672 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.163618 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12276h13m25.886442038s"
Apr 16 18:02:01.165056 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.165041 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:01.189789 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.189772 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkp2k"
Apr 16 18:02:01.197026 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.196989 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:01.197165 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.197031 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkp2k"
Apr 16 18:02:01.246624 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:01.246600 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.280888 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:01.280858 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede70444dfe8774ee4b88dac28542853.slice/crio-607b990bd083a0dd2b6112c951ffb48798645423d83a32db0b735c096647f72a WatchSource:0}: Error finding container 607b990bd083a0dd2b6112c951ffb48798645423d83a32db0b735c096647f72a: Status 404 returned error can't find the container with id 607b990bd083a0dd2b6112c951ffb48798645423d83a32db0b735c096647f72a
Apr 16 18:02:01.286431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.286418 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:02:01.292741 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.292701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" event={"ID":"ede70444dfe8774ee4b88dac28542853","Type":"ContainerStarted","Data":"607b990bd083a0dd2b6112c951ffb48798645423d83a32db0b735c096647f72a"}
Apr 16 18:02:01.347079 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:01.347057 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.400247 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.400232 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:01.447461 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:01.447420 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 16 18:02:01.517684 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.517662 2574 reflector.go:430]
"Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:01.553958 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.553942 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" Apr 16 18:02:01.565836 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.565820 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:01.566875 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.566864 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" Apr 16 18:02:01.575038 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.575026 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:01.618341 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:01.618319 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d5b24163c5b9fd60b620b912892f8b.slice/crio-63618fa95a4609180018a3b20a5b1f5834f178608dee5da961030e2b906bfc32 WatchSource:0}: Error finding container 63618fa95a4609180018a3b20a5b1f5834f178608dee5da961030e2b906bfc32: Status 404 returned error can't find the container with id 63618fa95a4609180018a3b20a5b1f5834f178608dee5da961030e2b906bfc32 Apr 16 18:02:01.960541 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:01.960512 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:02.134796 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.134764 2574 apiserver.go:52] "Watching apiserver" Apr 16 18:02:02.141597 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:02:02.141570 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:02:02.143928 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.143894 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-tkqz7","openshift-network-operator/iptables-alerter-v9kdc","openshift-ovn-kubernetes/ovnkube-node-mv8js","kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal","openshift-cluster-node-tuning-operator/tuned-2mtr8","openshift-dns/node-resolver-2nlc5","openshift-multus/multus-additional-cni-plugins-frw6p","openshift-multus/network-metrics-daemon-24pqv","kube-system/konnectivity-agent-5h65c","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q","openshift-image-registry/node-ca-qwbnl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal","openshift-multus/multus-w2x27"] Apr 16 18:02:02.146295 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.146272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.147680 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.147657 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.148207 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.148182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:02:02.148303 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.148187 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.148303 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.148228 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qktfc\"" Apr 16 18:02:02.148303 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.148188 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.149464 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.149444 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.149591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.149473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pcbgg\"" Apr 16 18:02:02.149591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.149580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.149948 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.149928 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.150564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.150543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:02.151431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.151116 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.151780 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152004 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152082 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152139 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152221 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152264 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:02:02.152438 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152324 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c8lcb\"" Apr 16 18:02:02.152786 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152537 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.152786 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.152635 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.153141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.153123 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.153232 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.153206 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zxmdm\"" Apr 16 18:02:02.154091 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.154072 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.154272 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.154256 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rtd5c\"" Apr 16 18:02:02.154869 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.154854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.155502 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.155482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.156779 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.156886 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157152 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157183 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157411 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157555 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6w628\"" Apr 16 18:02:02.157852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.157658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.158270 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.158107 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.158270 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.158225 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:02.158545 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.158521 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb" Apr 16 18:02:02.159880 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.159862 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:02:02.160281 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.160262 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d8fxd\"" Apr 16 18:02:02.160353 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.160332 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:02:02.162081 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.162065 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.162474 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.162118 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.163919 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.163900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pplkx\"" Apr 16 18:02:02.164413 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:02:02.164833 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164771 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9f7s6\"" Apr 16 18:02:02.164936 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-netd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.164936 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-etc-selinux\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.164936 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-sys-fs\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 
18:02:02.164936 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-var-lib-kubelet\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.165124 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.164957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-var-lib-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.165242 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.165211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-socket-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.165308 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.165268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64cbfc80-dc30-49cf-a0eb-68130da967eb-host-slash\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.165402 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.165361 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:02.166029 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.165784 
2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:02:02.166029 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.165837 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:02:02.168092 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-tmp\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.168198 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55bg4\" (UniqueName: \"kubernetes.io/projected/9d69a3dd-c0fd-4764-b3d8-802189b16640-kube-api-access-55bg4\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:02.168257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-ovn\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.168257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-env-overrides\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.168353 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-d\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.168353 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-lib-modules\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.168478 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovn-node-metrics-cert\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.168478 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-conf\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.168573 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.168471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-node-log\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.169381 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-script-lib\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.169468 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-run\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169468 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-systemd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncpp9\" (UniqueName: \"kubernetes.io/projected/2f391f14-a93d-421b-8f8c-642ea53a1269-kube-api-access-ncpp9\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169496 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-tuned\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-host\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-config\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-system-cni-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.169591 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64cbfc80-dc30-49cf-a0eb-68130da967eb-iptables-alerter-script\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysconfig\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-sys\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj2hg\" (UniqueName: \"kubernetes.io/projected/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-kube-api-access-nj2hg\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-systemd-units\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-netns\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-etc-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.169850 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-bin\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f391f14-a93d-421b-8f8c-642ea53a1269-tmp-dir\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-cnibin\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.169977 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7nm5\" (UniqueName: \"kubernetes.io/projected/1e22989e-67d8-41f9-acfa-874e304428b5-kube-api-access-h7nm5\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96jj\" (UniqueName: \"kubernetes.io/projected/64cbfc80-dc30-49cf-a0eb-68130da967eb-kube-api-access-s96jj\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcxq\" (UniqueName: \"kubernetes.io/projected/298a93d5-3bc0-4a9d-9dd0-922e60e48669-kube-api-access-6bcxq\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-os-release\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-systemd\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-device-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-kubernetes\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-modprobe-d\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170429 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-registration-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwfc\" (UniqueName: \"kubernetes.io/projected/c3e86810-7277-49b0-aa54-682451b6950a-kube-api-access-scwfc\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-slash\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-log-socket\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-kubelet\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f391f14-a93d-421b-8f8c-642ea53a1269-hosts-file\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5"
Apr 16 18:02:02.170768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.170705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.197664 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.197634 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:01 +0000 UTC" deadline="2027-10-27 05:05:42.045012979 +0000 UTC"
Apr 16 18:02:02.197664 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.197663 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13403h3m39.847353779s"
Apr 16 18:02:02.255859 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.255792 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:02:02.271277 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-log-socket\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.271407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-konnectivity-ca\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c"
Apr 16 18:02:02.271407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-cni-binary-copy\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.271407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-kubelet\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.271407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f391f14-a93d-421b-8f8c-642ea53a1269-hosts-file\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271462 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-cnibin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271515 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-os-release\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f391f14-a93d-421b-8f8c-642ea53a1269-hosts-file\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-netd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-netns\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-kubelet\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-kubelet\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-log-socket\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.271636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-etc-selinux\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-sys-fs\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-var-lib-kubelet\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-var-lib-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-multus\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-multus-daemon-config\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-socket-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64cbfc80-dc30-49cf-a0eb-68130da967eb-host-slash\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-tmp\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55bg4\" (UniqueName: \"kubernetes.io/projected/9d69a3dd-c0fd-4764-b3d8-802189b16640-kube-api-access-55bg4\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-ovn\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-env-overrides\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.271995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-multus-certs\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-d\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-lib-modules\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovn-node-metrics-cert\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.272166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-bin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-conf\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-socket-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-node-log\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-script-lib\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-var-lib-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-sys-fs\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-etc-selinux\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-netd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64cbfc80-dc30-49cf-a0eb-68130da967eb-host-slash\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-system-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-lib-modules\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-run\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-systemd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-run\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncpp9\" (UniqueName: \"kubernetes.io/projected/2f391f14-a93d-421b-8f8c-642ea53a1269-kube-api-access-ncpp9\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6k2\" (UniqueName: \"kubernetes.io/projected/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-kube-api-access-dq6k2\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl"
Apr 16 18:02:02.272943 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-tuned\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-systemd\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272548 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272572 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-d\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-ovn\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-var-lib-kubelet\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-env-overrides\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysctl-conf\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzfz7\" (UniqueName: \"kubernetes.io/projected/c03c1135-b04a-4662-866b-1180974b6c3e-kube-api-access-jzfz7\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-node-log\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-host\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-script-lib\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-config\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-host\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-system-cni-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-agent-certs\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.272976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-etc-kubernetes\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q"
Apr 16 18:02:02.273746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-system-cni-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p"
Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64cbfc80-dc30-49cf-a0eb-68130da967eb-iptables-alerter-script\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc"
Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysconfig\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-sys\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8"
Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj2hg\" (UniqueName:
\"kubernetes.io/projected/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-kube-api-access-nj2hg\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-sysconfig\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-systemd-units\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-netns\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-etc-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-sys\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovnkube-config\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-bin\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f391f14-a93d-421b-8f8c-642ea53a1269-tmp-dir\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-cnibin\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7nm5\" (UniqueName: \"kubernetes.io/projected/1e22989e-67d8-41f9-acfa-874e304428b5-kube-api-access-h7nm5\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.274588 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-cni-bin\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273600 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-socket-dir-parent\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64cbfc80-dc30-49cf-a0eb-68130da967eb-iptables-alerter-script\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s96jj\" (UniqueName: \"kubernetes.io/projected/64cbfc80-dc30-49cf-a0eb-68130da967eb-kube-api-access-s96jj\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcxq\" (UniqueName: \"kubernetes.io/projected/298a93d5-3bc0-4a9d-9dd0-922e60e48669-kube-api-access-6bcxq\") pod \"ovnkube-node-mv8js\" (UID: 
\"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-os-release\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-serviceca\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-systemd-units\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-hostroot\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f391f14-a93d-421b-8f8c-642ea53a1269-tmp-dir\") pod \"node-resolver-2nlc5\" (UID: 
\"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-cnibin\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-netns\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-os-release\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-etc-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.275396 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-systemd\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-run-ovn-kubernetes\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274426 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-host\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.273990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-systemd\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-k8s-cni-cncf-io\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-device-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-kubernetes\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274587 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-device-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-modprobe-d\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-kubernetes\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 
18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-conf-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-registration-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.276209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e22989e-67d8-41f9-acfa-874e304428b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scwfc\" (UniqueName: \"kubernetes.io/projected/c3e86810-7277-49b0-aa54-682451b6950a-kube-api-access-scwfc\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: 
\"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-slash\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274790 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-run-openvswitch\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/298a93d5-3bc0-4a9d-9dd0-922e60e48669-host-slash\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-modprobe-d\") pod 
\"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.274921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c3e86810-7277-49b0-aa54-682451b6950a-registration-dir\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.274995 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.275062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e22989e-67d8-41f9-acfa-874e304428b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.275068 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:02.775047664 +0000 UTC m=+3.052461538 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.275640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-etc-tuned\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.275904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-tmp\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.277085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.276689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/298a93d5-3bc0-4a9d-9dd0-922e60e48669-ovn-node-metrics-cert\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.283658 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.283456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncpp9\" (UniqueName: \"kubernetes.io/projected/2f391f14-a93d-421b-8f8c-642ea53a1269-kube-api-access-ncpp9\") pod \"node-resolver-2nlc5\" (UID: \"2f391f14-a93d-421b-8f8c-642ea53a1269\") " pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.286022 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.285998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6bcxq\" (UniqueName: \"kubernetes.io/projected/298a93d5-3bc0-4a9d-9dd0-922e60e48669-kube-api-access-6bcxq\") pod \"ovnkube-node-mv8js\" (UID: \"298a93d5-3bc0-4a9d-9dd0-922e60e48669\") " pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.286770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.286721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwfc\" (UniqueName: \"kubernetes.io/projected/c3e86810-7277-49b0-aa54-682451b6950a-kube-api-access-scwfc\") pod \"aws-ebs-csi-driver-node-k9l7q\" (UID: \"c3e86810-7277-49b0-aa54-682451b6950a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.287830 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.287229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96jj\" (UniqueName: \"kubernetes.io/projected/64cbfc80-dc30-49cf-a0eb-68130da967eb-kube-api-access-s96jj\") pod \"iptables-alerter-v9kdc\" (UID: \"64cbfc80-dc30-49cf-a0eb-68130da967eb\") " pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.289212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.289139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7nm5\" (UniqueName: \"kubernetes.io/projected/1e22989e-67d8-41f9-acfa-874e304428b5-kube-api-access-h7nm5\") pod \"multus-additional-cni-plugins-frw6p\" (UID: \"1e22989e-67d8-41f9-acfa-874e304428b5\") " pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.290645 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.290533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj2hg\" (UniqueName: \"kubernetes.io/projected/1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e-kube-api-access-nj2hg\") pod \"tuned-2mtr8\" (UID: \"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e\") " pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.290645 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:02:02.290544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55bg4\" (UniqueName: \"kubernetes.io/projected/9d69a3dd-c0fd-4764-b3d8-802189b16640-kube-api-access-55bg4\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:02.295070 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.295043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"96d5b24163c5b9fd60b620b912892f8b","Type":"ContainerStarted","Data":"63618fa95a4609180018a3b20a5b1f5834f178608dee5da961030e2b906bfc32"} Apr 16 18:02:02.375757 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.375725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-multus-certs\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.375870 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.375769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-bin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.375870 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.375836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-multus-certs\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.375970 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:02:02.375898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-bin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.375970 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.375941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-system-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376058 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.375972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6k2\" (UniqueName: \"kubernetes.io/projected/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-kube-api-access-dq6k2\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.376058 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzfz7\" (UniqueName: \"kubernetes.io/projected/c03c1135-b04a-4662-866b-1180974b6c3e-kube-api-access-jzfz7\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376149 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-system-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376199 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376163 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-agent-certs\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.376199 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-etc-kubernetes\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376319 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376319 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-socket-dir-parent\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376319 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376305 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-cni-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376316 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-etc-kubernetes\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-socket-dir-parent\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-serviceca\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.376487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-hostroot\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-host\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-k8s-cni-cncf-io\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-conf-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-konnectivity-ca\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-cni-binary-copy\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376623 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-host\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: 
\"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-cnibin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-hostroot\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-os-release\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.376720 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-netns\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-serviceca\") pod \"node-ca-qwbnl\" 
(UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-kubelet\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-kubelet\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-k8s-cni-cncf-io\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-os-release\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-multus-conf-dir\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " 
pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.376980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-run-netns\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.377011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-multus\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.377037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-multus-daemon-config\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.377096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-konnectivity-ca\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.377196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.377116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-cnibin\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377196 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:02:02.377150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c03c1135-b04a-4662-866b-1180974b6c3e-host-var-lib-cni-multus\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.377841 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.377242 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-cni-binary-copy\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.378099 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.378078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c03c1135-b04a-4662-866b-1180974b6c3e-multus-daemon-config\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.378727 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.378708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c-agent-certs\") pod \"konnectivity-agent-5h65c\" (UID: \"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c\") " pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.381826 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.381806 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:02.381826 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.381829 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Apr 16 18:02:02.381966 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.381841 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:02.381966 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.381898 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:02:02.881883563 +0000 UTC m=+3.159297445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:02.383979 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.383961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzfz7\" (UniqueName: \"kubernetes.io/projected/c03c1135-b04a-4662-866b-1180974b6c3e-kube-api-access-jzfz7\") pod \"multus-w2x27\" (UID: \"c03c1135-b04a-4662-866b-1180974b6c3e\") " pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.384076 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.384015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6k2\" (UniqueName: \"kubernetes.io/projected/9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b-kube-api-access-dq6k2\") pod \"node-ca-qwbnl\" (UID: \"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b\") " pod="openshift-image-registry/node-ca-qwbnl" 
Apr 16 18:02:02.458879 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.458855 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" Apr 16 18:02:02.470543 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.470491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9kdc" Apr 16 18:02:02.478150 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.478131 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:02.485728 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.485711 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" Apr 16 18:02:02.492254 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.492235 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2nlc5" Apr 16 18:02:02.504777 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.504756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-frw6p" Apr 16 18:02:02.513360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.513341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:02.518970 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.518952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qwbnl" Apr 16 18:02:02.524471 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.524455 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w2x27" Apr 16 18:02:02.781243 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.781175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:02.781399 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.781312 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.781399 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.781393 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:03.781360481 +0000 UTC m=+4.058774369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.982185 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:02.982151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:02.982349 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.982291 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:02.982349 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.982309 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:02.982349 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.982321 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:02.982529 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:02.982399 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:03.982362459 +0000 UTC m=+4.259776331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:03.062680 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.062550 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a04ff9e_4799_4b81_b4f0_2fcae1f80a7e.slice/crio-85c50819968095371cd8cd23feb0847a6361392c5d426b9fb43db3a3a5f05fd6 WatchSource:0}: Error finding container 85c50819968095371cd8cd23feb0847a6361392c5d426b9fb43db3a3a5f05fd6: Status 404 returned error can't find the container with id 85c50819968095371cd8cd23feb0847a6361392c5d426b9fb43db3a3a5f05fd6 Apr 16 18:02:03.066876 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.066857 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276ccc08_ca8c_4aca_9ce7_0dcbf63c4c4c.slice/crio-085e731b005d724aad33a09f3456933a5fd39b8e4f0540a240639e34a1b81c95 WatchSource:0}: Error finding container 085e731b005d724aad33a09f3456933a5fd39b8e4f0540a240639e34a1b81c95: Status 404 returned error can't find the container with id 085e731b005d724aad33a09f3456933a5fd39b8e4f0540a240639e34a1b81c95 Apr 16 18:02:03.068648 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.068622 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc03c1135_b04a_4662_866b_1180974b6c3e.slice/crio-d06555087b87d2d670faa8748735b7c829f35b731f71364b4136589ed6254d1c WatchSource:0}: Error finding container 
d06555087b87d2d670faa8748735b7c829f35b731f71364b4136589ed6254d1c: Status 404 returned error can't find the container with id d06555087b87d2d670faa8748735b7c829f35b731f71364b4136589ed6254d1c Apr 16 18:02:03.069047 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.069027 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f391f14_a93d_421b_8f8c_642ea53a1269.slice/crio-9c8cfa6ed76fc40167cc574797dfd814d7644b052710fcf2a1f639cada2f0d32 WatchSource:0}: Error finding container 9c8cfa6ed76fc40167cc574797dfd814d7644b052710fcf2a1f639cada2f0d32: Status 404 returned error can't find the container with id 9c8cfa6ed76fc40167cc574797dfd814d7644b052710fcf2a1f639cada2f0d32 Apr 16 18:02:03.069698 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.069678 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298a93d5_3bc0_4a9d_9dd0_922e60e48669.slice/crio-9637091c136501c83b91dd74069526158ab6b3fbabd16c10b5834716ecda05c4 WatchSource:0}: Error finding container 9637091c136501c83b91dd74069526158ab6b3fbabd16c10b5834716ecda05c4: Status 404 returned error can't find the container with id 9637091c136501c83b91dd74069526158ab6b3fbabd16c10b5834716ecda05c4 Apr 16 18:02:03.071035 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.070729 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64cbfc80_dc30_49cf_a0eb_68130da967eb.slice/crio-62e9a01a503392e7e76bf440bf38b8b6f48cbeaca53398fd5f173c8ba1f31225 WatchSource:0}: Error finding container 62e9a01a503392e7e76bf440bf38b8b6f48cbeaca53398fd5f173c8ba1f31225: Status 404 returned error can't find the container with id 62e9a01a503392e7e76bf440bf38b8b6f48cbeaca53398fd5f173c8ba1f31225 Apr 16 18:02:03.072137 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.072089 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e86810_7277_49b0_aa54_682451b6950a.slice/crio-527660916ddcb434feb0dd40ee38176dbd3a921477860c243aeec1444f7da6d2 WatchSource:0}: Error finding container 527660916ddcb434feb0dd40ee38176dbd3a921477860c243aeec1444f7da6d2: Status 404 returned error can't find the container with id 527660916ddcb434feb0dd40ee38176dbd3a921477860c243aeec1444f7da6d2 Apr 16 18:02:03.072486 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.072461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e22989e_67d8_41f9_acfa_874e304428b5.slice/crio-69c2b7c4204f43e06afeb2084db47133eb20e7e048620e0954fc4852fe51ced6 WatchSource:0}: Error finding container 69c2b7c4204f43e06afeb2084db47133eb20e7e048620e0954fc4852fe51ced6: Status 404 returned error can't find the container with id 69c2b7c4204f43e06afeb2084db47133eb20e7e048620e0954fc4852fe51ced6 Apr 16 18:02:03.074304 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:03.074245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5a1448_8fde_4017_8bb4_c60dbbbe2a1b.slice/crio-a4869c6f5721d505e33fdd23112b7ed2096463db2b71e295c892739c7c321b2e WatchSource:0}: Error finding container a4869c6f5721d505e33fdd23112b7ed2096463db2b71e295c892739c7c321b2e: Status 404 returned error can't find the container with id a4869c6f5721d505e33fdd23112b7ed2096463db2b71e295c892739c7c321b2e Apr 16 18:02:03.198435 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.198295 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:01 +0000 UTC" deadline="2027-11-05 13:54:43.110195174 +0000 UTC" Apr 16 18:02:03.198435 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.198433 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13627h52m39.911765643s" Apr 16 18:02:03.297418 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.297392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qwbnl" event={"ID":"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b","Type":"ContainerStarted","Data":"a4869c6f5721d505e33fdd23112b7ed2096463db2b71e295c892739c7c321b2e"} Apr 16 18:02:03.298298 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.298277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerStarted","Data":"69c2b7c4204f43e06afeb2084db47133eb20e7e048620e0954fc4852fe51ced6"} Apr 16 18:02:03.299266 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.299246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" event={"ID":"c3e86810-7277-49b0-aa54-682451b6950a","Type":"ContainerStarted","Data":"527660916ddcb434feb0dd40ee38176dbd3a921477860c243aeec1444f7da6d2"} Apr 16 18:02:03.300171 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.300150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2nlc5" event={"ID":"2f391f14-a93d-421b-8f8c-642ea53a1269","Type":"ContainerStarted","Data":"9c8cfa6ed76fc40167cc574797dfd814d7644b052710fcf2a1f639cada2f0d32"} Apr 16 18:02:03.301189 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.301170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5h65c" event={"ID":"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c","Type":"ContainerStarted","Data":"085e731b005d724aad33a09f3456933a5fd39b8e4f0540a240639e34a1b81c95"} Apr 16 18:02:03.302708 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.302678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" 
event={"ID":"ede70444dfe8774ee4b88dac28542853","Type":"ContainerStarted","Data":"85f83529fa69a9a68e2a051247d47e5dcc9cb48952eb0e1c79e185c0e84e73eb"}
Apr 16 18:02:03.303754 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.303683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v9kdc" event={"ID":"64cbfc80-dc30-49cf-a0eb-68130da967eb","Type":"ContainerStarted","Data":"62e9a01a503392e7e76bf440bf38b8b6f48cbeaca53398fd5f173c8ba1f31225"}
Apr 16 18:02:03.304754 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.304734 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"9637091c136501c83b91dd74069526158ab6b3fbabd16c10b5834716ecda05c4"}
Apr 16 18:02:03.305691 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.305666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w2x27" event={"ID":"c03c1135-b04a-4662-866b-1180974b6c3e","Type":"ContainerStarted","Data":"d06555087b87d2d670faa8748735b7c829f35b731f71364b4136589ed6254d1c"}
Apr 16 18:02:03.306696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.306672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" event={"ID":"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e","Type":"ContainerStarted","Data":"85c50819968095371cd8cd23feb0847a6361392c5d426b9fb43db3a3a5f05fd6"}
Apr 16 18:02:03.314428 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.314361 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" podStartSLOduration=2.314348856 podStartE2EDuration="2.314348856s" podCreationTimestamp="2026-04-16 18:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:03.31383598 +0000 UTC m=+3.591249875" watchObservedRunningTime="2026-04-16 18:02:03.314348856 +0000 UTC m=+3.591762751"
Apr 16 18:02:03.787714 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.787615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:03.787871 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.787755 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:03.787871 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.787821 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:05.78780227 +0000 UTC m=+6.065216150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:03.989085 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:03.989047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:03.989257 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.989240 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:03.989332 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.989264 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:03.989332 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.989278 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:03.989456 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:03.989332 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:02:05.989313671 +0000 UTC m=+6.266727562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:04.291665 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:04.291635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:04.292166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:04.292149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:04.292273 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:04.292250 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:04.294534 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:04.294504 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:04.322915 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:04.322843 2574 generic.go:358] "Generic (PLEG): container finished" podID="96d5b24163c5b9fd60b620b912892f8b" containerID="326ec30a5952f864ecd61d134a755c5474ec8d97fb9d68f0f9bb942e6583a17e" exitCode=0
Apr 16 18:02:04.323703 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:04.323679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"96d5b24163c5b9fd60b620b912892f8b","Type":"ContainerDied","Data":"326ec30a5952f864ecd61d134a755c5474ec8d97fb9d68f0f9bb942e6583a17e"}
Apr 16 18:02:05.345243 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:05.345213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"96d5b24163c5b9fd60b620b912892f8b","Type":"ContainerStarted","Data":"4219d337cc93830dd594482f4230a5b4b0f29c4fc8774e9141a6b3a267663563"}
Apr 16 18:02:05.807006 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:05.806925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:05.807163 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:05.807089 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:05.807163 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:05.807158 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:09.807140321 +0000 UTC m=+10.084554195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:06.009117 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:06.009080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:06.009292 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.009253 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:06.009292 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.009272 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:06.009292 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.009284 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:06.009491 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.009341 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:02:10.009322854 +0000 UTC m=+10.286736728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:06.290754 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:06.290613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:06.290754 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:06.290651 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:06.290953 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.290749 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:06.291199 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:06.291144 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:08.291129 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:08.291095 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:08.291129 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:08.291111 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:08.291629 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:08.291217 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:08.291629 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:08.291514 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:09.838998 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:09.838960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:09.839463 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:09.839080 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:09.839463 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:09.839126 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:17.839113787 +0000 UTC m=+18.116527659 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:10.040749 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:10.040713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:10.040911 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.040864 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:10.040911 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.040882 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:10.040911 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.040894 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:10.041029 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.040962 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:02:18.040943936 +0000 UTC m=+18.318357813 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:10.292107 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:10.292069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:10.292295 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.292176 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:10.292443 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:10.292420 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:10.292558 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:10.292538 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:12.291875 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:12.291109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:12.291875 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:12.291143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:12.291875 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:12.291266 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:12.291875 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:12.291401 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:14.290735 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:14.290703 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:14.291165 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:14.290718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:14.291165 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:14.290834 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:14.291165 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:14.290873 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:15.968969 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:15.968909 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" podStartSLOduration=14.968893609 podStartE2EDuration="14.968893609s" podCreationTimestamp="2026-04-16 18:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:05.358759865 +0000 UTC m=+5.636173761" watchObservedRunningTime="2026-04-16 18:02:15.968893609 +0000 UTC m=+16.246307528"
Apr 16 18:02:15.969578 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:15.969559 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gtd4f"]
Apr 16 18:02:15.984663 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:15.984634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:15.984789 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:15.984718 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:16.082911 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.082877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-kubelet-config\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.083048 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.082925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.083105 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.083060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-dbus\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.183834 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.183799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-kubelet-config\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.183973 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.183846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.183973 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.183910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-dbus\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.183973 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.183934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-kubelet-config\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.184136 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.184044 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:16.184136 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.184099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b5f0a92-12a8-4b64-989f-c8db774a38c1-dbus\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.184136 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.184103 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:16.684084022 +0000 UTC m=+16.961497907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:16.290968 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.290898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:16.291101 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.290898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:16.291101 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.291019 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:16.291209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.291128 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:16.688159 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:16.688127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:16.688324 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.688253 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:16.688324 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:16.688320 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:17.688301784 +0000 UTC m=+17.965715659 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:17.291440 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:17.291404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:17.291841 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:17.291526 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:17.695268 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:17.695239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:17.695528 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:17.695420 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:17.695528 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:17.695496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:19.695475436 +0000 UTC m=+19.972889321 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:17.896060 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:17.896021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:17.896341 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:17.896188 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:17.896341 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:17.896260 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.896239472 +0000 UTC m=+34.173653359 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:18.096842 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:18.096756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:18.097008 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.096921 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:18.097008 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.096941 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:18.097008 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.096953 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:18.097225 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.097021 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.097002594 +0000 UTC m=+34.374416466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:18.291628 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:18.291583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:18.291628 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:18.291604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:18.292102 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.291721 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:18.292102 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:18.291843 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:19.291461 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:19.291431 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:19.291615 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:19.291533 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:19.710119 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:19.710090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:19.710572 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:19.710253 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:19.710572 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:19.710329 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:23.710310053 +0000 UTC m=+23.987723929 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:20.291775 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.291623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:20.291874 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:20.291816 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:02:20.291934 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.291696 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:20.292067 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:20.292051 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb" Apr 16 18:02:20.369240 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.369206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qwbnl" event={"ID":"9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b","Type":"ContainerStarted","Data":"4ec4385cc0ac3ddc5f753ba2512d55c4981de0f7d69ea371a48c2ddfa44182f8"} Apr 16 18:02:20.370440 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.370412 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerStarted","Data":"131afbb1ba3cd0d8549436c1db2fca0df63718774abc1a7ecb36bd2ee1f40222"} Apr 16 18:02:20.371685 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.371659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" event={"ID":"c3e86810-7277-49b0-aa54-682451b6950a","Type":"ContainerStarted","Data":"328605ba8779a76828f595671390614cd449d3d7d69a3dfe167ec70b8d6aca92"} Apr 16 18:02:20.372953 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.372923 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2nlc5" event={"ID":"2f391f14-a93d-421b-8f8c-642ea53a1269","Type":"ContainerStarted","Data":"9a9933dfa059715b8c2d765df60ccb71e683cb612798a00d5407713393d6bdaf"} Apr 16 18:02:20.374069 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.374041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5h65c" event={"ID":"276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c","Type":"ContainerStarted","Data":"b6f2e120a7ad567408bf7fd1ffb2c29810b23af07bc23d1d164847fff3c2604f"} Apr 16 18:02:20.375327 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.375308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w2x27" 
event={"ID":"c03c1135-b04a-4662-866b-1180974b6c3e","Type":"ContainerStarted","Data":"13a5cccbc16686f3abc2fe45e4ba03f2a6470ed44fae7ad4cf75f4e7f3211eb0"} Apr 16 18:02:20.376462 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.376442 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" event={"ID":"1a04ff9e-4799-4b81-b4f0-2fcae1f80a7e","Type":"ContainerStarted","Data":"66a0ef5256abb8a38d184e8207e8e56a849d4ed15945413fd0e3baa08f86fb7b"} Apr 16 18:02:20.415273 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.415236 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qwbnl" podStartSLOduration=8.013413756 podStartE2EDuration="20.415224091s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.075499395 +0000 UTC m=+3.352913267" lastFinishedPulling="2026-04-16 18:02:15.477309725 +0000 UTC m=+15.754723602" observedRunningTime="2026-04-16 18:02:20.392693002 +0000 UTC m=+20.670106908" watchObservedRunningTime="2026-04-16 18:02:20.415224091 +0000 UTC m=+20.692637985" Apr 16 18:02:20.415618 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.415593 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2mtr8" podStartSLOduration=3.527832106 podStartE2EDuration="20.415586087s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.064993737 +0000 UTC m=+3.342407609" lastFinishedPulling="2026-04-16 18:02:19.952747705 +0000 UTC m=+20.230161590" observedRunningTime="2026-04-16 18:02:20.414865083 +0000 UTC m=+20.692278971" watchObservedRunningTime="2026-04-16 18:02:20.415586087 +0000 UTC m=+20.692999981" Apr 16 18:02:20.450255 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.450224 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5h65c" 
podStartSLOduration=3.7200812450000003 podStartE2EDuration="20.450212095s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.068700978 +0000 UTC m=+3.346114856" lastFinishedPulling="2026-04-16 18:02:19.79883182 +0000 UTC m=+20.076245706" observedRunningTime="2026-04-16 18:02:20.435637953 +0000 UTC m=+20.713051858" watchObservedRunningTime="2026-04-16 18:02:20.450212095 +0000 UTC m=+20.727625986" Apr 16 18:02:20.450443 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.450421 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w2x27" podStartSLOduration=3.495169234 podStartE2EDuration="20.450416086s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.070786736 +0000 UTC m=+3.348200619" lastFinishedPulling="2026-04-16 18:02:20.026033596 +0000 UTC m=+20.303447471" observedRunningTime="2026-04-16 18:02:20.450353511 +0000 UTC m=+20.727767403" watchObservedRunningTime="2026-04-16 18:02:20.450416086 +0000 UTC m=+20.727829980" Apr 16 18:02:20.464925 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:20.464879 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2nlc5" podStartSLOduration=3.736747639 podStartE2EDuration="20.464865276s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.070792386 +0000 UTC m=+3.348206262" lastFinishedPulling="2026-04-16 18:02:19.798910022 +0000 UTC m=+20.076323899" observedRunningTime="2026-04-16 18:02:20.464247838 +0000 UTC m=+20.741661732" watchObservedRunningTime="2026-04-16 18:02:20.464865276 +0000 UTC m=+20.742279170" Apr 16 18:02:21.291480 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.291311 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f" Apr 16 18:02:21.292139 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:21.291552 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1" Apr 16 18:02:21.308702 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.308671 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:02:21.378984 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.378950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v9kdc" event={"ID":"64cbfc80-dc30-49cf-a0eb-68130da967eb","Type":"ContainerStarted","Data":"5f80329d1216b65d2b73a799f561cb8ebdf3c0f7bb9d2d77ff57321a1fcc7e8b"} Apr 16 18:02:21.381345 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"01bf31ff0a4669cb34cf0c21d7a5edbf3d5632683e735c76617741a9da2e62e9"} Apr 16 18:02:21.381444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"fbcc315b578d6277a1a7f5455ae643ca3435b7940082df1b3a8b4dee17c269cb"} Apr 16 18:02:21.381444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" 
event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"746522dfba0caa95006943678c3b9fd4ad8c8bc2a09ea7c7cb59bee1d836d7af"} Apr 16 18:02:21.381444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"e53845324c38373ba2b07284e9e2591d3eaaaeb5357177b6d7af98962f7fd2ed"} Apr 16 18:02:21.381444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"f63f4ba4473d02ecd8f181b7ace9b861a738a91e4aa213597479811aad55aa37"} Apr 16 18:02:21.381444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.381410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"a52a63e7667fa92437cc5e351540c95f1695fb0eadd993220a04069c86ee427a"} Apr 16 18:02:21.382606 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.382585 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="131afbb1ba3cd0d8549436c1db2fca0df63718774abc1a7ecb36bd2ee1f40222" exitCode=0 Apr 16 18:02:21.382687 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.382649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"131afbb1ba3cd0d8549436c1db2fca0df63718774abc1a7ecb36bd2ee1f40222"} Apr 16 18:02:21.384240 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.384212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" 
event={"ID":"c3e86810-7277-49b0-aa54-682451b6950a","Type":"ContainerStarted","Data":"30f1ce449ff7a9d822840a97d3a41f7057c4d147168217801140d6dfe2be737b"} Apr 16 18:02:21.392866 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:21.392824 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v9kdc" podStartSLOduration=4.448594639 podStartE2EDuration="21.392813476s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.073265106 +0000 UTC m=+3.350678990" lastFinishedPulling="2026-04-16 18:02:20.017483954 +0000 UTC m=+20.294897827" observedRunningTime="2026-04-16 18:02:21.392710779 +0000 UTC m=+21.670124686" watchObservedRunningTime="2026-04-16 18:02:21.392813476 +0000 UTC m=+21.670227407" Apr 16 18:02:22.221287 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.221148 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:02:21.308698641Z","UUID":"e31ffe45-570d-4c68-b812-5c7c27970ed9","Handler":null,"Name":"","Endpoint":""} Apr 16 18:02:22.223246 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.223195 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:02:22.223246 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.223235 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:02:22.290702 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.290678 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:22.290823 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:22.290802 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb" Apr 16 18:02:22.291067 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.291052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:22.291170 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:22.291154 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:02:22.387671 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:22.387613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" event={"ID":"c3e86810-7277-49b0-aa54-682451b6950a","Type":"ContainerStarted","Data":"be9f2f0f384de25615fa4b92c736becb90a90863d5a8ed11aea3e4c08b796a5e"} Apr 16 18:02:23.290823 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:23.290627 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f" Apr 16 18:02:23.290951 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:23.290855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1" Apr 16 18:02:23.392804 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:23.392770 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"23f56fbbe0eeff1864095e46524bd8a03ed247e50a9193f1d0b212284e021968"} Apr 16 18:02:23.689929 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:23.689893 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:23.690587 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:23.690565 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:23.706989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:23.706943 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k9l7q" podStartSLOduration=4.728234467 podStartE2EDuration="23.706926292s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.074535393 +0000 UTC m=+3.351949266" lastFinishedPulling="2026-04-16 18:02:22.053227219 +0000 UTC m=+22.330641091" observedRunningTime="2026-04-16 18:02:22.407594157 +0000 UTC m=+22.685008052" watchObservedRunningTime="2026-04-16 18:02:23.706926292 +0000 UTC m=+23.984340188" Apr 16 18:02:23.745510 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:02:23.745477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f" Apr 16 18:02:23.745659 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:23.745614 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:23.745727 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:23.745692 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:31.745672222 +0000 UTC m=+32.023086098 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:24.290796 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:24.290762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:24.290796 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:24.290785 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:24.291028 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:24.290915 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:02:24.291028 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:24.291013 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb" Apr 16 18:02:24.395056 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:24.395026 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:24.395506 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:24.395458 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5h65c" Apr 16 18:02:25.291138 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:25.291072 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f" Apr 16 18:02:25.291281 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:25.291198 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1" Apr 16 18:02:26.291050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.291029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:02:26.291749 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.291029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:02:26.291749 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:26.291121 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:02:26.291749 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:26.291184 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb" Apr 16 18:02:26.401278 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.401147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" event={"ID":"298a93d5-3bc0-4a9d-9dd0-922e60e48669","Type":"ContainerStarted","Data":"45d306bd80f2ec0d160104947a8b0f7ead8fbe5a0abc428e352c13643c18d188"} Apr 16 18:02:26.401598 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.401577 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:26.414530 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.414510 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:26.434595 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:26.434559 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" podStartSLOduration=9.014402612 podStartE2EDuration="26.434547819s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.072395372 +0000 UTC m=+3.349809258" lastFinishedPulling="2026-04-16 18:02:20.492540593 +0000 UTC m=+20.769954465" observedRunningTime="2026-04-16 18:02:26.434332914 +0000 UTC m=+26.711746808" watchObservedRunningTime="2026-04-16 18:02:26.434547819 +0000 UTC m=+26.711961749" Apr 16 18:02:27.290569 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.290530 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f" Apr 16 18:02:27.290722 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:27.290675 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1" Apr 16 18:02:27.405148 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.405105 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="f3c6d5cef821d207dded7bc1e004003f9e62dfef971ad862697f68b1e18d4e08" exitCode=0 Apr 16 18:02:27.406939 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.406906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"f3c6d5cef821d207dded7bc1e004003f9e62dfef971ad862697f68b1e18d4e08"} Apr 16 18:02:27.407054 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.406954 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:27.407219 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.407202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:27.423896 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.423873 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:02:27.992862 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.992829 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tkqz7"] Apr 16 
18:02:27.993046 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.992975 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:27.993100 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:27.993071 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:27.996088 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.996065 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtd4f"]
Apr 16 18:02:27.996222 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.996164 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:27.996278 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:27.996257 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:27.996893 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.996868 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-24pqv"]
Apr 16 18:02:27.997005 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:27.996987 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:27.997093 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:27.997072 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:28.409098 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:28.409062 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="df8b1d2c059bd7aee7185d924556639d6e537b2335f0ebbe11c158f15236cd86" exitCode=0
Apr 16 18:02:28.409435 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:28.409145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"df8b1d2c059bd7aee7185d924556639d6e537b2335f0ebbe11c158f15236cd86"}
Apr 16 18:02:29.290573 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:29.290517 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:29.290676 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:29.290620 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:29.413630 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:29.413599 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="ce7b2495d09f43ce50605bcb106ddfd7f2041b21e34fc52c8f4868a52341ce6b" exitCode=0
Apr 16 18:02:29.414028 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:29.413692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"ce7b2495d09f43ce50605bcb106ddfd7f2041b21e34fc52c8f4868a52341ce6b"}
Apr 16 18:02:30.291555 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:30.291341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:30.291725 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:30.291631 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:30.291799 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:30.291430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:30.291889 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:30.291868 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:31.291012 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:31.290981 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:31.291457 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:31.291112 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtd4f" podUID="5b5f0a92-12a8-4b64-989f-c8db774a38c1"
Apr 16 18:02:31.808939 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:31.808899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:31.809213 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:31.809137 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:31.809272 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:31.809267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret podName:5b5f0a92-12a8-4b64-989f-c8db774a38c1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:47.809246343 +0000 UTC m=+48.086660214 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret") pod "global-pull-secret-syncer-gtd4f" (UID: "5b5f0a92-12a8-4b64-989f-c8db774a38c1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:32.291440 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:32.291412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:32.291894 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:32.291551 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640"
Apr 16 18:02:32.291894 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:32.291583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:32.291894 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:32.291645 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkqz7" podUID="299ac103-77d7-4ae3-b981-8c66d39e67eb"
Apr 16 18:02:33.033523 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.033490 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeReady"
Apr 16 18:02:33.033680 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.033634 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:02:33.093193 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.093113 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-btqpr"]
Apr 16 18:02:33.123773 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.123750 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z9628"]
Apr 16 18:02:33.123933 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.123918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.126603 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.126582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:02:33.126759 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.126609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\""
Apr 16 18:02:33.126997 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.126977 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:02:33.147418 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.147393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-btqpr"]
Apr 16 18:02:33.147418 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.147422 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z9628"]
Apr 16 18:02:33.147579 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.147474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.157129 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.157110 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:02:33.157227 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.157132 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:02:33.157636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.157616 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:02:33.158280 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.158221 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\""
Apr 16 18:02:33.222214 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.222188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567b1659-cf99-4ad6-aad7-99460710d869-tmp-dir\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.222331 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.222247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfps7\" (UniqueName: \"kubernetes.io/projected/567b1659-cf99-4ad6-aad7-99460710d869-kube-api-access-kfps7\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.222406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.222326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.222406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.222388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/567b1659-cf99-4ad6-aad7-99460710d869-config-volume\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.291534 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.291513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:33.295044 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.295024 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:02:33.323718 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.323689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfps7\" (UniqueName: \"kubernetes.io/projected/567b1659-cf99-4ad6-aad7-99460710d869-kube-api-access-kfps7\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.323795 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.323740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.323863 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.323832 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:33.323912 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.323892 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.823873069 +0000 UTC m=+34.101286942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:33.323977 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.323922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/567b1659-cf99-4ad6-aad7-99460710d869-config-volume\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.324031 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.323996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.324081 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.324033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567b1659-cf99-4ad6-aad7-99460710d869-tmp-dir\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.324081 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.324068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84qk\" (UniqueName: \"kubernetes.io/projected/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-kube-api-access-z84qk\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.324307 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.324293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567b1659-cf99-4ad6-aad7-99460710d869-tmp-dir\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.324431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.324414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/567b1659-cf99-4ad6-aad7-99460710d869-config-volume\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.343070 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.343054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfps7\" (UniqueName: \"kubernetes.io/projected/567b1659-cf99-4ad6-aad7-99460710d869-kube-api-access-kfps7\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.425303 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.425273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z84qk\" (UniqueName: \"kubernetes.io/projected/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-kube-api-access-z84qk\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.425460 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.425411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.425570 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.425552 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:33.425636 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.425623 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.925601615 +0000 UTC m=+34.203015499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found
Apr 16 18:02:33.448678 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.448650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84qk\" (UniqueName: \"kubernetes.io/projected/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-kube-api-access-z84qk\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.829405 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.829310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:33.829578 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.829466 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:33.829578 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.829538 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.829518487 +0000 UTC m=+35.106932362 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:33.930238 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.930207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:33.930426 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:33.930276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:33.930426 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.930379 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:33.930426 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.930399 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:33.930589 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.930456 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.930436081 +0000 UTC m=+35.207850152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found
Apr 16 18:02:33.930589 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:33.930480 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:05.930470392 +0000 UTC m=+66.207884265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:34.131683 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.131601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:34.131877 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.131771 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:34.131877 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.131795 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:34.131877 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.131809 2574 projected.go:194] Error preparing data for projected volume kube-api-access-grbkn for pod openshift-network-diagnostics/network-check-target-tkqz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:34.131877 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.131871 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn podName:299ac103-77d7-4ae3-b981-8c66d39e67eb nodeName:}" failed. No retries permitted until 2026-04-16 18:03:06.131854671 +0000 UTC m=+66.409268557 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-grbkn" (UniqueName: "kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn") pod "network-check-target-tkqz7" (UID: "299ac103-77d7-4ae3-b981-8c66d39e67eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:34.291564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.291530 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7"
Apr 16 18:02:34.291564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.291555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:02:34.298278 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.298255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:02:34.298523 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.298328 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:02:34.298523 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.298504 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:02:34.298738 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.298709 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\""
Apr 16 18:02:34.298799 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.298773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j756c\""
Apr 16 18:02:34.836087 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.836050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:34.836347 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.836157 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:34.836347 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.836224 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:36.836208383 +0000 UTC m=+37.113622256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:34.937174 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:34.937139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:34.937347 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.937326 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:34.937441 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:34.937427 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:02:36.937405893 +0000 UTC m=+37.214819768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found
Apr 16 18:02:36.426641 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:36.426608 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="a47c36c4da1003830b68db00f74144561d537614f8e9f42ac8f1e5814be19a73" exitCode=0
Apr 16 18:02:36.427054 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:36.426685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"a47c36c4da1003830b68db00f74144561d537614f8e9f42ac8f1e5814be19a73"}
Apr 16 18:02:36.852105 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:36.852050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:36.852206 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:36.852189 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:36.852259 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:36.852251 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.852234324 +0000 UTC m=+41.129648195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:36.953319 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:36.953295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:36.953434 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:36.953421 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:36.953477 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:36.953469 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.953457233 +0000 UTC m=+41.230871105 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found
Apr 16 18:02:37.430700 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:37.430674 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e22989e-67d8-41f9-acfa-874e304428b5" containerID="fed9e643d24d91f7beb4285d2db640766e4cb6e821aa5a47b13be1768b4c4947" exitCode=0
Apr 16 18:02:37.430999 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:37.430717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerDied","Data":"fed9e643d24d91f7beb4285d2db640766e4cb6e821aa5a47b13be1768b4c4947"}
Apr 16 18:02:38.436061 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:38.435852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-frw6p" event={"ID":"1e22989e-67d8-41f9-acfa-874e304428b5","Type":"ContainerStarted","Data":"41eb97c31449578a66c2ef50439106141afa9b426f278f6716749f0ed6a58ffa"}
Apr 16 18:02:38.459610 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:38.459561 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-frw6p" podStartSLOduration=6.120618602 podStartE2EDuration="38.459546437s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:02:03.075820698 +0000 UTC m=+3.353234581" lastFinishedPulling="2026-04-16 18:02:35.41474853 +0000 UTC m=+35.692162416" observedRunningTime="2026-04-16 18:02:38.458920318 +0000 UTC m=+38.736334212" watchObservedRunningTime="2026-04-16 18:02:38.459546437 +0000 UTC m=+38.736960331"
Apr 16 18:02:40.880212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:40.880169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:40.880695 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:40.880312 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:40.880695 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:40.880401 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.880365442 +0000 UTC m=+49.157779314 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:40.980882 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:40.980853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628"
Apr 16 18:02:40.981015 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:40.980956 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:40.981015 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:40.981007 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.980992789 +0000 UTC m=+49.258406661 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found
Apr 16 18:02:47.827467 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:47.827418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:47.830621 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:47.830599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b5f0a92-12a8-4b64-989f-c8db774a38c1-original-pull-secret\") pod \"global-pull-secret-syncer-gtd4f\" (UID: \"5b5f0a92-12a8-4b64-989f-c8db774a38c1\") " pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:48.001640 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:48.001614 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtd4f"
Apr 16 18:02:48.163791 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:48.163768 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtd4f"]
Apr 16 18:02:48.456211 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:48.456176 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtd4f" event={"ID":"5b5f0a92-12a8-4b64-989f-c8db774a38c1","Type":"ContainerStarted","Data":"917ce7b6fb116dd689d6b0da9644e646934953e73aabd883b395d90277ee3227"}
Apr 16 18:02:48.934008 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:48.933977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr"
Apr 16 18:02:48.934412 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:48.934132 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:48.934412 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:48.934208 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:04.934187254 +0000 UTC m=+65.211601140 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found Apr 16 18:02:49.034601 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:49.034570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:02:49.034735 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:49.034722 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:49.034815 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:02:49.034803 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:03:05.034784011 +0000 UTC m=+65.312197895 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found Apr 16 18:02:52.466212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:52.466177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtd4f" event={"ID":"5b5f0a92-12a8-4b64-989f-c8db774a38c1","Type":"ContainerStarted","Data":"09dec3dfc183440e1f3224ab00bdd61dcf6cea1a20141383291f4b175322afe8"} Apr 16 18:02:52.482925 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:52.482883 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gtd4f" podStartSLOduration=33.41435063 podStartE2EDuration="37.482866956s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:48.177757374 +0000 UTC m=+48.455171245" lastFinishedPulling="2026-04-16 18:02:52.246273696 +0000 UTC m=+52.523687571" observedRunningTime="2026-04-16 18:02:52.482510437 +0000 UTC m=+52.759924330" watchObservedRunningTime="2026-04-16 18:02:52.482866956 +0000 UTC m=+52.760280859" Apr 16 18:02:53.622579 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.622539 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9"] Apr 16 18:02:53.625533 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.625512 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.628140 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.628111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:02:53.628758 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.628618 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:02:53.628758 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.628653 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:02:53.629006 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.628943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:02:53.629182 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.629161 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-znfdm\"" Apr 16 18:02:53.634218 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.634196 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9"] Apr 16 18:02:53.668707 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.668678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzfpw\" (UniqueName: \"kubernetes.io/projected/665ebed3-4568-4fa2-8182-61d33ca28db9-kube-api-access-gzfpw\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.668816 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.668728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/665ebed3-4568-4fa2-8182-61d33ca28db9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.769151 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.769127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzfpw\" (UniqueName: \"kubernetes.io/projected/665ebed3-4568-4fa2-8182-61d33ca28db9-kube-api-access-gzfpw\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.769258 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.769165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/665ebed3-4568-4fa2-8182-61d33ca28db9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.771776 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.771757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/665ebed3-4568-4fa2-8182-61d33ca28db9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.777989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.777962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzfpw\" (UniqueName: \"kubernetes.io/projected/665ebed3-4568-4fa2-8182-61d33ca28db9-kube-api-access-gzfpw\") pod \"managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9\" (UID: \"665ebed3-4568-4fa2-8182-61d33ca28db9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:53.945048 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:53.945028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" Apr 16 18:02:54.083894 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:54.083838 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9"] Apr 16 18:02:54.086353 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:02:54.086324 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod665ebed3_4568_4fa2_8182_61d33ca28db9.slice/crio-571ee84a34cef03742ba1d50a9785a80a6dd3f020771e495d6b6cf29649f907e WatchSource:0}: Error finding container 571ee84a34cef03742ba1d50a9785a80a6dd3f020771e495d6b6cf29649f907e: Status 404 returned error can't find the container with id 571ee84a34cef03742ba1d50a9785a80a6dd3f020771e495d6b6cf29649f907e Apr 16 18:02:54.471715 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:54.471680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" 
event={"ID":"665ebed3-4568-4fa2-8182-61d33ca28db9","Type":"ContainerStarted","Data":"571ee84a34cef03742ba1d50a9785a80a6dd3f020771e495d6b6cf29649f907e"} Apr 16 18:02:57.477621 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:57.477591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" event={"ID":"665ebed3-4568-4fa2-8182-61d33ca28db9","Type":"ContainerStarted","Data":"65f80bc0dc8fd3894595bac0c1c37d7dcabacbed740774f7c14412c258c26da0"} Apr 16 18:02:57.492301 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:57.492238 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" podStartSLOduration=2.065915975 podStartE2EDuration="4.492224491s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:02:54.088248418 +0000 UTC m=+54.365662303" lastFinishedPulling="2026-04-16 18:02:56.514556946 +0000 UTC m=+56.791970819" observedRunningTime="2026-04-16 18:02:57.491549082 +0000 UTC m=+57.768962976" watchObservedRunningTime="2026-04-16 18:02:57.492224491 +0000 UTC m=+57.769638397" Apr 16 18:02:59.424922 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:02:59.424884 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mv8js" Apr 16 18:03:04.952164 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:04.952131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr" Apr 16 18:03:04.952556 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:04.952252 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" 
not found Apr 16 18:03:04.952556 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:04.952301 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.95228827 +0000 UTC m=+97.229702141 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found Apr 16 18:03:05.052816 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:05.052789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:03:05.052917 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:05.052878 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:05.052955 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:05.052922 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:03:37.052909318 +0000 UTC m=+97.330323189 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found Apr 16 18:03:05.958286 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:05.958251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:03:05.960661 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:05.960641 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:05.968542 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:05.968524 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:03:05.968641 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:05.968588 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:09.968569118 +0000 UTC m=+130.245982990 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : secret "metrics-daemon-secret" not found Apr 16 18:03:06.160219 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.160186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:03:06.162679 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.162662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:03:06.173393 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.173357 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:03:06.184579 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.184551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/299ac103-77d7-4ae3-b981-8c66d39e67eb-kube-api-access-grbkn\") pod \"network-check-target-tkqz7\" (UID: \"299ac103-77d7-4ae3-b981-8c66d39e67eb\") " pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:03:06.407494 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.407441 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j756c\"" Apr 16 18:03:06.415275 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.415257 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:03:06.524826 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:06.524775 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tkqz7"] Apr 16 18:03:06.528579 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:03:06.528541 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299ac103_77d7_4ae3_b981_8c66d39e67eb.slice/crio-3399244911a111645a910d22cf7ffde919beb59007f6e4de0fac5fc509c80949 WatchSource:0}: Error finding container 3399244911a111645a910d22cf7ffde919beb59007f6e4de0fac5fc509c80949: Status 404 returned error can't find the container with id 3399244911a111645a910d22cf7ffde919beb59007f6e4de0fac5fc509c80949 Apr 16 18:03:07.497380 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:07.497336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tkqz7" event={"ID":"299ac103-77d7-4ae3-b981-8c66d39e67eb","Type":"ContainerStarted","Data":"3399244911a111645a910d22cf7ffde919beb59007f6e4de0fac5fc509c80949"} Apr 16 18:03:09.502361 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:09.502329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tkqz7" event={"ID":"299ac103-77d7-4ae3-b981-8c66d39e67eb","Type":"ContainerStarted","Data":"ef3d66bbb395427374a306b418c65422a5152c40ac1d53775215066177eb4e73"} Apr 16 18:03:09.502746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:09.502458 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:03:09.519641 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:09.519600 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tkqz7" 
podStartSLOduration=66.886184758 podStartE2EDuration="1m9.51958844s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:03:06.53030778 +0000 UTC m=+66.807721671" lastFinishedPulling="2026-04-16 18:03:09.163711469 +0000 UTC m=+69.441125353" observedRunningTime="2026-04-16 18:03:09.519551163 +0000 UTC m=+69.796965057" watchObservedRunningTime="2026-04-16 18:03:09.51958844 +0000 UTC m=+69.797002333" Apr 16 18:03:36.960031 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:36.959881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr" Apr 16 18:03:36.960031 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:36.960003 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:36.960599 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:36.960061 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls podName:567b1659-cf99-4ad6-aad7-99460710d869 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.960045322 +0000 UTC m=+161.237459194 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls") pod "dns-default-btqpr" (UID: "567b1659-cf99-4ad6-aad7-99460710d869") : secret "dns-default-metrics-tls" not found Apr 16 18:03:37.060290 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:37.060253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:03:37.060445 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:37.060424 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:37.060511 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:03:37.060499 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert podName:6e8ad75a-91e2-49d1-b5b3-8f18c05f867d nodeName:}" failed. No retries permitted until 2026-04-16 18:04:41.060481649 +0000 UTC m=+161.337895538 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert") pod "ingress-canary-z9628" (UID: "6e8ad75a-91e2-49d1-b5b3-8f18c05f867d") : secret "canary-serving-cert" not found Apr 16 18:03:40.506189 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:03:40.506163 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tkqz7" Apr 16 18:04:09.976125 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:09.976080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:04:09.976667 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:09.976225 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:04:09.976667 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:09.976290 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs podName:9d69a3dd-c0fd-4764-b3d8-802189b16640 nodeName:}" failed. No retries permitted until 2026-04-16 18:06:11.976271487 +0000 UTC m=+252.253685376 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs") pod "network-metrics-daemon-24pqv" (UID: "9d69a3dd-c0fd-4764-b3d8-802189b16640") : secret "metrics-daemon-secret" not found Apr 16 18:04:23.299992 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.299957 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-gssjt"] Apr 16 18:04:23.302769 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.302751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" Apr 16 18:04:23.308193 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.308170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:04:23.309251 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.309179 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:23.309916 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.309892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:23.310492 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.310473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-snk2j\"" Apr 16 18:04:23.310613 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.310588 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:04:23.326948 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.326924 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" 
Apr 16 18:04:23.366000 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.365973 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-gssjt"] Apr 16 18:04:23.405136 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.405116 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"] Apr 16 18:04:23.408277 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.408258 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"] Apr 16 18:04:23.408472 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.408457 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" Apr 16 18:04:23.410970 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.410950 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"] Apr 16 18:04:23.411079 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.411065 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" Apr 16 18:04:23.412583 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.412568 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-sr4m9\"" Apr 16 18:04:23.413211 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413194 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:04:23.413382 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413215 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:04:23.413490 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413223 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:23.413490 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413392 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9tm6d\"" Apr 16 18:04:23.413490 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:04:23.413490 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413284 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:23.413677 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413607 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"] Apr 16 18:04:23.413767 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413749 2574 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:04:23.413812 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.413756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:23.416493 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416362 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v" Apr 16 18:04:23.416493 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416409 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-kggb4\"" Apr 16 18:04:23.416493 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416427 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:04:23.416493 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416468 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:04:23.416717 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416427 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:04:23.416717 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.416692 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:04:23.431477 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.431438 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"] Apr 16 18:04:23.432219 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.432178 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"]
Apr 16 18:04:23.433119 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.433096 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"]
Apr 16 18:04:23.434056 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.434030 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"]
Apr 16 18:04:23.435013 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.434991 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"]
Apr 16 18:04:23.437772 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.437751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.458031 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-config\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.458031 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc84b07b-78de-4ae8-bb54-38e289deb5f1-serving-cert\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.458215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzfxj\" (UniqueName: \"kubernetes.io/projected/dc84b07b-78de-4ae8-bb54-38e289deb5f1-kube-api-access-dzfxj\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.458215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-trusted-ca\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.458215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458179 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:04:23.458215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458206 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:04:23.458409 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.458389 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdkzb\""
Apr 16 18:04:23.465050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.465037 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:04:23.470496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.470482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pmlct\""
Apr 16 18:04:23.479001 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.478971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"]
Apr 16 18:04:23.518053 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.518028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:04:23.558999 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.558925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmv6t\" (UniqueName: \"kubernetes.io/projected/d7f5c41a-d5cd-4717-b0c7-464ae63f0d81-kube-api-access-vmv6t\") pod \"network-check-source-7b678d77c7-bmn9v\" (UID: \"d7f5c41a-d5cd-4717-b0c7-464ae63f0d81\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"
Apr 16 18:04:23.558999 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.558959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.558999 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.558989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-config\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5l5\" (UniqueName: \"kubernetes.io/projected/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-kube-api-access-6b5l5\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6661a-e671-4470-a717-3ca52c801a37-config\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6661a-e671-4470-a717-3ca52c801a37-serving-cert\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e4633a3-2eb6-4ec0-af69-c94c4487082b-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-trusted-ca\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.559529 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwwq\" (UniqueName: \"kubernetes.io/projected/35a6661a-e671-4470-a717-3ca52c801a37-kube-api-access-cvwwq\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc84b07b-78de-4ae8-bb54-38e289deb5f1-serving-cert\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559619 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzfxj\" (UniqueName: \"kubernetes.io/projected/dc84b07b-78de-4ae8-bb54-38e289deb5f1-kube-api-access-dzfxj\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6xkm\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.559815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.559792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-config\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.560406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.560365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84b07b-78de-4ae8-bb54-38e289deb5f1-trusted-ca\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.561824 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.561807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc84b07b-78de-4ae8-bb54-38e289deb5f1-serving-cert\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.570821 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.570799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzfxj\" (UniqueName: \"kubernetes.io/projected/dc84b07b-78de-4ae8-bb54-38e289deb5f1-kube-api-access-dzfxj\") pod \"console-operator-d87b8d5fc-gssjt\" (UID: \"dc84b07b-78de-4ae8-bb54-38e289deb5f1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.612781 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.612756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt"
Apr 16 18:04:23.660154 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660283 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660283 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xkm\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660283 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmv6t\" (UniqueName: \"kubernetes.io/projected/d7f5c41a-d5cd-4717-b0c7-464ae63f0d81-kube-api-access-vmv6t\") pod \"network-check-source-7b678d77c7-bmn9v\" (UID: \"d7f5c41a-d5cd-4717-b0c7-464ae63f0d81\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"
Apr 16 18:04:23.660283 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.660360 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5l5\" (UniqueName: \"kubernetes.io/projected/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-kube-api-access-6b5l5\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.660392 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c8fc9545-l6f7w: secret "image-registry-tls" not found
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.660463 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls podName:10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.160443245 +0000 UTC m=+144.437857134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls") pod "image-registry-54c8fc9545-l6f7w" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290") : secret "image-registry-tls" not found
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.660518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6661a-e671-4470-a717-3ca52c801a37-config\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.661057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.661057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6661a-e671-4470-a717-3ca52c801a37-serving-cert\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.661057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e4633a3-2eb6-4ec0-af69-c94c4487082b-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:23.661057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwwq\" (UniqueName: \"kubernetes.io/projected/35a6661a-e671-4470-a717-3ca52c801a37-kube-api-access-cvwwq\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.661057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.660845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.661297 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.661117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.662068 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.661502 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:04:23.662068 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.661606 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls podName:8b0ce7ac-c20c-4958-8a1f-f646f39de6ab nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.161586163 +0000 UTC m=+144.439000048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-k7vp9" (UID: "8b0ce7ac-c20c-4958-8a1f-f646f39de6ab") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:04:23.662068 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.661839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e4633a3-2eb6-4ec0-af69-c94c4487082b-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:23.662068 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.661841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.662068 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.662054 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:04:23.662403 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:23.662118 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.162099896 +0000 UTC m=+144.439513769 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found
Apr 16 18:04:23.662403 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.662306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6661a-e671-4470-a717-3ca52c801a37-config\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.663410 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.663361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.664120 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.664096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6661a-e671-4470-a717-3ca52c801a37-serving-cert\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.665328 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.665275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.665505 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.665481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.676157 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.676106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xkm\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.676269 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.676167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvwwq\" (UniqueName: \"kubernetes.io/projected/35a6661a-e671-4470-a717-3ca52c801a37-kube-api-access-cvwwq\") pod \"service-ca-operator-69965bb79d-rwh86\" (UID: \"35a6661a-e671-4470-a717-3ca52c801a37\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.678507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.678459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmv6t\" (UniqueName: \"kubernetes.io/projected/d7f5c41a-d5cd-4717-b0c7-464ae63f0d81-kube-api-access-vmv6t\") pod \"network-check-source-7b678d77c7-bmn9v\" (UID: \"d7f5c41a-d5cd-4717-b0c7-464ae63f0d81\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"
Apr 16 18:04:23.678662 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.678642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5l5\" (UniqueName: \"kubernetes.io/projected/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-kube-api-access-6b5l5\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:23.679162 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.679142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:23.724945 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.724916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"
Apr 16 18:04:23.736463 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.736441 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"
Apr 16 18:04:23.743993 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.743968 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-gssjt"]
Apr 16 18:04:23.747729 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:23.747698 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc84b07b_78de_4ae8_bb54_38e289deb5f1.slice/crio-fa0bfbde3e0c75929dfbfe83cd7dd97cdb6a316623350e8069000a69ed2050cf WatchSource:0}: Error finding container fa0bfbde3e0c75929dfbfe83cd7dd97cdb6a316623350e8069000a69ed2050cf: Status 404 returned error can't find the container with id fa0bfbde3e0c75929dfbfe83cd7dd97cdb6a316623350e8069000a69ed2050cf
Apr 16 18:04:23.861308 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.861260 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86"]
Apr 16 18:04:23.864519 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:23.864497 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35a6661a_e671_4470_a717_3ca52c801a37.slice/crio-d66788f66d9421b65e218d511a92a08af557b7d689763d7ccf622b0b749c76c7 WatchSource:0}: Error finding container d66788f66d9421b65e218d511a92a08af557b7d689763d7ccf622b0b749c76c7: Status 404 returned error can't find the container with id d66788f66d9421b65e218d511a92a08af557b7d689763d7ccf622b0b749c76c7
Apr 16 18:04:23.868285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:23.868261 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v"]
Apr 16 18:04:23.870899 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:23.870875 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f5c41a_d5cd_4717_b0c7_464ae63f0d81.slice/crio-51b1efb23ed35115460859c31d43cacada60ab109b62c1769d69eddc10ac0855 WatchSource:0}: Error finding container 51b1efb23ed35115460859c31d43cacada60ab109b62c1769d69eddc10ac0855: Status 404 returned error can't find the container with id 51b1efb23ed35115460859c31d43cacada60ab109b62c1769d69eddc10ac0855
Apr 16 18:04:24.164998 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.164906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:04:24.164998 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.164975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.165014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165079 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165100 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c8fc9545-l6f7w: secret "image-registry-tls" not found
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165135 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165147 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165168 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls podName:10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:25.165148577 +0000 UTC m=+145.442562451 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls") pod "image-registry-54c8fc9545-l6f7w" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290") : secret "image-registry-tls" not found
Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165189 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls podName:8b0ce7ac-c20c-4958-8a1f-f646f39de6ab nodeName:}" failed. No retries permitted until 2026-04-16 18:04:25.165178936 +0000 UTC m=+145.442592811 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-k7vp9" (UID: "8b0ce7ac-c20c-4958-8a1f-f646f39de6ab") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:24.165209 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:24.165204 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:25.165195848 +0000 UTC m=+145.442609732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found Apr 16 18:04:24.647564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.647501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v" event={"ID":"d7f5c41a-d5cd-4717-b0c7-464ae63f0d81","Type":"ContainerStarted","Data":"7c28191e654de3d1ad4efb13fe5d87f848bd121f41f1e4f71a3a9102b4e58492"} Apr 16 18:04:24.647564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.647541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v" event={"ID":"d7f5c41a-d5cd-4717-b0c7-464ae63f0d81","Type":"ContainerStarted","Data":"51b1efb23ed35115460859c31d43cacada60ab109b62c1769d69eddc10ac0855"} Apr 16 18:04:24.649342 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.649305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" 
event={"ID":"dc84b07b-78de-4ae8-bb54-38e289deb5f1","Type":"ContainerStarted","Data":"fa0bfbde3e0c75929dfbfe83cd7dd97cdb6a316623350e8069000a69ed2050cf"} Apr 16 18:04:24.650632 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.650589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" event={"ID":"35a6661a-e671-4470-a717-3ca52c801a37","Type":"ContainerStarted","Data":"d66788f66d9421b65e218d511a92a08af557b7d689763d7ccf622b0b749c76c7"} Apr 16 18:04:24.665940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:24.665885 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-bmn9v" podStartSLOduration=1.665869633 podStartE2EDuration="1.665869633s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:24.664690446 +0000 UTC m=+144.942104345" watchObservedRunningTime="2026-04-16 18:04:24.665869633 +0000 UTC m=+144.943283529" Apr 16 18:04:25.174121 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:25.174085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:25.174287 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:25.174252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" Apr 16 18:04:25.174343 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174258 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:25.174343 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174324 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:25.174446 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174341 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c8fc9545-l6f7w: secret "image-registry-tls" not found Apr 16 18:04:25.174446 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:25.174313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:25.174446 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174404 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:25.174446 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174411 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:27.174390759 +0000 UTC m=+147.451804638 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found Apr 16 18:04:25.174446 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174435 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls podName:10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:27.174425888 +0000 UTC m=+147.451839765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls") pod "image-registry-54c8fc9545-l6f7w" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290") : secret "image-registry-tls" not found Apr 16 18:04:25.174665 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:25.174461 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls podName:8b0ce7ac-c20c-4958-8a1f-f646f39de6ab nodeName:}" failed. No retries permitted until 2026-04-16 18:04:27.174445388 +0000 UTC m=+147.451859268 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-k7vp9" (UID: "8b0ce7ac-c20c-4958-8a1f-f646f39de6ab") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:26.657892 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.657871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/0.log" Apr 16 18:04:26.658318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.657905 2574 generic.go:358] "Generic (PLEG): container finished" podID="dc84b07b-78de-4ae8-bb54-38e289deb5f1" containerID="34f5c7aff4730daa72ddaa95160af3aedae9d83c89b467f017f2db0ffa1b6041" exitCode=255 Apr 16 18:04:26.658318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.657973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" event={"ID":"dc84b07b-78de-4ae8-bb54-38e289deb5f1","Type":"ContainerDied","Data":"34f5c7aff4730daa72ddaa95160af3aedae9d83c89b467f017f2db0ffa1b6041"} Apr 16 18:04:26.658318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.658170 2574 scope.go:117] "RemoveContainer" containerID="34f5c7aff4730daa72ddaa95160af3aedae9d83c89b467f017f2db0ffa1b6041" Apr 16 18:04:26.659519 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.659491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" event={"ID":"35a6661a-e671-4470-a717-3ca52c801a37","Type":"ContainerStarted","Data":"4f7207e8b043fbc9d0835d4e3c32e210aaacf123479c2177a52f8f182d1b4137"} Apr 16 18:04:26.729050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:26.729007 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" podStartSLOduration=1.583770052 podStartE2EDuration="3.728988962s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:23.866343987 +0000 UTC m=+144.143757863" lastFinishedPulling="2026-04-16 18:04:26.011562901 +0000 UTC m=+146.288976773" observedRunningTime="2026-04-16 18:04:26.728142648 +0000 UTC m=+147.005556543" watchObservedRunningTime="2026-04-16 18:04:26.728988962 +0000 UTC m=+147.006402834" Apr 16 18:04:27.190338 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.190301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:27.190496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.190391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" Apr 16 18:04:27.190496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.190427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:27.190496 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190481 2574 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190503 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c8fc9545-l6f7w: secret "image-registry-tls" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190522 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190544 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls podName:10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:31.190550051 +0000 UTC m=+151.467963930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls") pod "image-registry-54c8fc9545-l6f7w" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290") : secret "image-registry-tls" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190594 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls podName:8b0ce7ac-c20c-4958-8a1f-f646f39de6ab nodeName:}" failed. No retries permitted until 2026-04-16 18:04:31.190577408 +0000 UTC m=+151.467991296 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-k7vp9" (UID: "8b0ce7ac-c20c-4958-8a1f-f646f39de6ab") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:27.190648 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.190611 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:31.190601646 +0000 UTC m=+151.468015525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found Apr 16 18:04:27.663362 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.663286 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:04:27.663728 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.663677 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/0.log" Apr 16 18:04:27.663728 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.663708 2574 generic.go:358] "Generic (PLEG): container finished" podID="dc84b07b-78de-4ae8-bb54-38e289deb5f1" containerID="008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed" exitCode=255 Apr 16 18:04:27.663839 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.663802 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" event={"ID":"dc84b07b-78de-4ae8-bb54-38e289deb5f1","Type":"ContainerDied","Data":"008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed"} Apr 16 18:04:27.663896 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.663847 2574 scope.go:117] "RemoveContainer" containerID="34f5c7aff4730daa72ddaa95160af3aedae9d83c89b467f017f2db0ffa1b6041" Apr 16 18:04:27.664067 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:27.664052 2574 scope.go:117] "RemoveContainer" containerID="008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed" Apr 16 18:04:27.664273 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:27.664252 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-gssjt_openshift-console-operator(dc84b07b-78de-4ae8-bb54-38e289deb5f1)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" podUID="dc84b07b-78de-4ae8-bb54-38e289deb5f1" Apr 16 18:04:28.667731 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:28.667700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:04:28.668227 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:28.668072 2574 scope.go:117] "RemoveContainer" containerID="008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed" Apr 16 18:04:28.668273 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:28.668249 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-gssjt_openshift-console-operator(dc84b07b-78de-4ae8-bb54-38e289deb5f1)\"" 
pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" podUID="dc84b07b-78de-4ae8-bb54-38e289deb5f1" Apr 16 18:04:28.881204 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:28.881178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2nlc5_2f391f14-a93d-421b-8f8c-642ea53a1269/dns-node-resolver/0.log" Apr 16 18:04:29.880384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.880338 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qwbnl_9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b/node-ca/0.log" Apr 16 18:04:29.936173 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.936142 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-774s5"] Apr 16 18:04:29.939960 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.939944 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:29.941887 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.941862 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:04:29.941887 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.941882 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-htzcl\"" Apr 16 18:04:29.942284 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.942263 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:04:29.942424 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.942292 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:04:29.942424 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.942330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:04:29.947773 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:29.947753 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-774s5"] Apr 16 18:04:30.111924 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.111879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-key\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.112076 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.111935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-cabundle\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.112076 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.112010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb2pf\" (UniqueName: \"kubernetes.io/projected/7ad1b6a7-e327-47cf-8623-79c19086e85d-kube-api-access-xb2pf\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.213174 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.213148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-key\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.213285 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.213186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-cabundle\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.213285 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.213211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb2pf\" (UniqueName: \"kubernetes.io/projected/7ad1b6a7-e327-47cf-8623-79c19086e85d-kube-api-access-xb2pf\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.213780 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.213761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-cabundle\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.215987 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.215968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ad1b6a7-e327-47cf-8623-79c19086e85d-signing-key\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.220699 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.220681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb2pf\" (UniqueName: \"kubernetes.io/projected/7ad1b6a7-e327-47cf-8623-79c19086e85d-kube-api-access-xb2pf\") pod \"service-ca-bfc587fb7-774s5\" (UID: \"7ad1b6a7-e327-47cf-8623-79c19086e85d\") " 
pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.249608 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.249589 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-774s5" Apr 16 18:04:30.362047 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.362022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-774s5"] Apr 16 18:04:30.364827 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:30.364798 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad1b6a7_e327_47cf_8623_79c19086e85d.slice/crio-77efe78605ce7ce5720e1d419cc625e68e6b30d9566a1ab7dd6d2202a7709dd3 WatchSource:0}: Error finding container 77efe78605ce7ce5720e1d419cc625e68e6b30d9566a1ab7dd6d2202a7709dd3: Status 404 returned error can't find the container with id 77efe78605ce7ce5720e1d419cc625e68e6b30d9566a1ab7dd6d2202a7709dd3 Apr 16 18:04:30.674496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.674463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-774s5" event={"ID":"7ad1b6a7-e327-47cf-8623-79c19086e85d","Type":"ContainerStarted","Data":"484ba44b1a36923ef1eb99b3f4a1c0816abce359d0987c83ee137964d40afd87"} Apr 16 18:04:30.674496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.674499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-774s5" event={"ID":"7ad1b6a7-e327-47cf-8623-79c19086e85d","Type":"ContainerStarted","Data":"77efe78605ce7ce5720e1d419cc625e68e6b30d9566a1ab7dd6d2202a7709dd3"} Apr 16 18:04:30.698714 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:30.698674 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-774s5" podStartSLOduration=1.698660695 podStartE2EDuration="1.698660695s" 
podCreationTimestamp="2026-04-16 18:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:30.697124614 +0000 UTC m=+150.974538509" watchObservedRunningTime="2026-04-16 18:04:30.698660695 +0000 UTC m=+150.976074590" Apr 16 18:04:31.220680 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:31.220647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:31.220689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:31.220740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220819 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 
18:04:31.220838 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220849 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c8fc9545-l6f7w: secret "image-registry-tls" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220860 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220897 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls podName:10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.220882554 +0000 UTC m=+159.498296443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls") pod "image-registry-54c8fc9545-l6f7w" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290") : secret "image-registry-tls" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.220904774 +0000 UTC m=+159.498318646 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found Apr 16 18:04:31.221137 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:31.220935 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls podName:8b0ce7ac-c20c-4958-8a1f-f646f39de6ab nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.22091544 +0000 UTC m=+159.498329315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-k7vp9" (UID: "8b0ce7ac-c20c-4958-8a1f-f646f39de6ab") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:33.613577 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:33.613542 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" Apr 16 18:04:33.613910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:33.613587 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" Apr 16 18:04:33.614011 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:33.613997 2574 scope.go:117] "RemoveContainer" containerID="008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed" Apr 16 18:04:33.614212 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:33.614193 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-d87b8d5fc-gssjt_openshift-console-operator(dc84b07b-78de-4ae8-bb54-38e289deb5f1)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" podUID="dc84b07b-78de-4ae8-bb54-38e289deb5f1" Apr 16 18:04:36.136847 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:36.136798 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-btqpr" podUID="567b1659-cf99-4ad6-aad7-99460710d869" Apr 16 18:04:36.157017 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:36.156975 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z9628" podUID="6e8ad75a-91e2-49d1-b5b3-8f18c05f867d" Apr 16 18:04:36.689520 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:36.689492 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-btqpr" Apr 16 18:04:36.689671 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:36.689496 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:04:37.309736 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:37.309705 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-24pqv" podUID="9d69a3dd-c0fd-4764-b3d8-802189b16640" Apr 16 18:04:39.285276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.285244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:39.285669 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.285293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" Apr 16 18:04:39.285669 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.285318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:39.285669 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:39.285421 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
secret "networking-console-plugin-cert" not found Apr 16 18:04:39.285669 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:04:39.285525 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert podName:5e4633a3-2eb6-4ec0-af69-c94c4487082b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:55.285509314 +0000 UTC m=+175.562923186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-kgn8f" (UID: "5e4633a3-2eb6-4ec0-af69-c94c4487082b") : secret "networking-console-plugin-cert" not found Apr 16 18:04:39.287581 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.287559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b0ce7ac-c20c-4958-8a1f-f646f39de6ab-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-k7vp9\" (UID: \"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:39.287788 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.287768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"image-registry-54c8fc9545-l6f7w\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") " pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:39.330785 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.330760 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" Apr 16 18:04:39.345521 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.345501 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:39.475568 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.475540 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9"] Apr 16 18:04:39.479601 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:39.479556 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0ce7ac_c20c_4958_8a1f_f646f39de6ab.slice/crio-4825a80460431d4aeb3e8a4cea99f2cc37ce9192fc34956268cb7f4bb97c5fd3 WatchSource:0}: Error finding container 4825a80460431d4aeb3e8a4cea99f2cc37ce9192fc34956268cb7f4bb97c5fd3: Status 404 returned error can't find the container with id 4825a80460431d4aeb3e8a4cea99f2cc37ce9192fc34956268cb7f4bb97c5fd3 Apr 16 18:04:39.498312 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.498289 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"] Apr 16 18:04:39.501618 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:39.501588 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c9c7b3_7baa_4a5f_9b87_9e76bbd6a290.slice/crio-5caeb20b9503bdee4368d320559074ae8b34ede63db04efe221c8876e9f7dfde WatchSource:0}: Error finding container 5caeb20b9503bdee4368d320559074ae8b34ede63db04efe221c8876e9f7dfde: Status 404 returned error can't find the container with id 5caeb20b9503bdee4368d320559074ae8b34ede63db04efe221c8876e9f7dfde Apr 16 18:04:39.697975 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.697937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" event={"ID":"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290","Type":"ContainerStarted","Data":"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"} Apr 16 18:04:39.698142 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.697987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" event={"ID":"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290","Type":"ContainerStarted","Data":"5caeb20b9503bdee4368d320559074ae8b34ede63db04efe221c8876e9f7dfde"} Apr 16 18:04:39.698142 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.698043 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:04:39.698974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.698953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" event={"ID":"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab","Type":"ContainerStarted","Data":"4825a80460431d4aeb3e8a4cea99f2cc37ce9192fc34956268cb7f4bb97c5fd3"} Apr 16 18:04:39.723856 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:39.723816 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" podStartSLOduration=16.72380355 podStartE2EDuration="16.72380355s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:39.722524905 +0000 UTC m=+159.999938800" watchObservedRunningTime="2026-04-16 18:04:39.72380355 +0000 UTC m=+160.001217443" Apr 16 18:04:41.001257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.001215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr" Apr 16 18:04:41.003822 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.003798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/567b1659-cf99-4ad6-aad7-99460710d869-metrics-tls\") pod \"dns-default-btqpr\" (UID: \"567b1659-cf99-4ad6-aad7-99460710d869\") " pod="openshift-dns/dns-default-btqpr" Apr 16 18:04:41.102186 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.102165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:04:41.104440 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.104419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8ad75a-91e2-49d1-b5b3-8f18c05f867d-cert\") pod \"ingress-canary-z9628\" (UID: \"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d\") " pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:04:41.192610 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.192567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\"" Apr 16 18:04:41.192908 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.192891 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\"" Apr 16 18:04:41.201431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.201412 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-btqpr" Apr 16 18:04:41.201518 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.201492 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z9628" Apr 16 18:04:41.398517 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.398490 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z9628"] Apr 16 18:04:41.403037 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:41.403001 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8ad75a_91e2_49d1_b5b3_8f18c05f867d.slice/crio-d8c197f751c618c856744de08c9ce51fea8e0b961e6f2a2c21a43a8ba6f1f5b2 WatchSource:0}: Error finding container d8c197f751c618c856744de08c9ce51fea8e0b961e6f2a2c21a43a8ba6f1f5b2: Status 404 returned error can't find the container with id d8c197f751c618c856744de08c9ce51fea8e0b961e6f2a2c21a43a8ba6f1f5b2 Apr 16 18:04:41.416642 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.416620 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-btqpr"] Apr 16 18:04:41.419774 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:41.419742 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod567b1659_cf99_4ad6_aad7_99460710d869.slice/crio-3f0c3c87164c1d5618238e3b9d41244505551fae673c3d843ed8787da1d754e1 WatchSource:0}: Error finding container 3f0c3c87164c1d5618238e3b9d41244505551fae673c3d843ed8787da1d754e1: Status 404 returned error can't find the container with id 3f0c3c87164c1d5618238e3b9d41244505551fae673c3d843ed8787da1d754e1 Apr 16 18:04:41.705120 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.705085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z9628" 
event={"ID":"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d","Type":"ContainerStarted","Data":"d8c197f751c618c856744de08c9ce51fea8e0b961e6f2a2c21a43a8ba6f1f5b2"} Apr 16 18:04:41.706534 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.706499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" event={"ID":"8b0ce7ac-c20c-4958-8a1f-f646f39de6ab","Type":"ContainerStarted","Data":"b0b124caee5eae1cde4fc0a610447049f283ca30c7e88ce838b4901a9ebd7c48"} Apr 16 18:04:41.707559 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.707535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-btqpr" event={"ID":"567b1659-cf99-4ad6-aad7-99460710d869","Type":"ContainerStarted","Data":"3f0c3c87164c1d5618238e3b9d41244505551fae673c3d843ed8787da1d754e1"} Apr 16 18:04:41.727476 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:41.727433 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-k7vp9" podStartSLOduration=17.109940595 podStartE2EDuration="18.727419762s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:39.481403625 +0000 UTC m=+159.758817509" lastFinishedPulling="2026-04-16 18:04:41.098882794 +0000 UTC m=+161.376296676" observedRunningTime="2026-04-16 18:04:41.726853112 +0000 UTC m=+162.004267008" watchObservedRunningTime="2026-04-16 18:04:41.727419762 +0000 UTC m=+162.004833652" Apr 16 18:04:43.714257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.714220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z9628" event={"ID":"6e8ad75a-91e2-49d1-b5b3-8f18c05f867d","Type":"ContainerStarted","Data":"08bbdc525b9c0064a94d66347737ab81f73ac39be681a96e669b6e8ab4a57b28"} Apr 16 18:04:43.715777 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.715753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-btqpr" event={"ID":"567b1659-cf99-4ad6-aad7-99460710d869","Type":"ContainerStarted","Data":"a12d7cdde0e2314055f59ec746b84e7d7ad61597bcc8448cb499d35b45e4e13f"} Apr 16 18:04:43.715871 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.715784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-btqpr" event={"ID":"567b1659-cf99-4ad6-aad7-99460710d869","Type":"ContainerStarted","Data":"07d1dc23c86dafb02fd46c9153f9a64f9f392bf660564f1db08927c6366087ba"} Apr 16 18:04:43.715922 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.715891 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-btqpr" Apr 16 18:04:43.753781 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.753743 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-btqpr" podStartSLOduration=128.929745649 podStartE2EDuration="2m10.753730954s" podCreationTimestamp="2026-04-16 18:02:33 +0000 UTC" firstStartedPulling="2026-04-16 18:04:41.422099316 +0000 UTC m=+161.699513187" lastFinishedPulling="2026-04-16 18:04:43.24608462 +0000 UTC m=+163.523498492" observedRunningTime="2026-04-16 18:04:43.752449336 +0000 UTC m=+164.029863230" watchObservedRunningTime="2026-04-16 18:04:43.753730954 +0000 UTC m=+164.031144847" Apr 16 18:04:43.754289 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:43.754262 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z9628" podStartSLOduration=128.911350401 podStartE2EDuration="2m10.754256312s" podCreationTimestamp="2026-04-16 18:02:33 +0000 UTC" firstStartedPulling="2026-04-16 18:04:41.405183507 +0000 UTC m=+161.682597394" lastFinishedPulling="2026-04-16 18:04:43.248089419 +0000 UTC m=+163.525503305" observedRunningTime="2026-04-16 18:04:43.732420537 +0000 UTC m=+164.009834431" watchObservedRunningTime="2026-04-16 18:04:43.754256312 +0000 UTC 
m=+164.031670200" Apr 16 18:04:47.291598 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.291504 2574 scope.go:117] "RemoveContainer" containerID="008034846b3516d3ebae453f57dfad43508a779d6a06ecb3098cb282f33a00ed" Apr 16 18:04:47.731608 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.731583 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:04:47.731744 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.731670 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" event={"ID":"dc84b07b-78de-4ae8-bb54-38e289deb5f1","Type":"ContainerStarted","Data":"70eb7b21bad110af498da6d65686e42b54889b96dd66bb3a5b8b1c373ce5eacd"} Apr 16 18:04:47.731955 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.731936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" Apr 16 18:04:47.750435 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.750393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" podStartSLOduration=22.491044152 podStartE2EDuration="24.750359106s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:23.749448452 +0000 UTC m=+144.026862324" lastFinishedPulling="2026-04-16 18:04:26.008763404 +0000 UTC m=+146.286177278" observedRunningTime="2026-04-16 18:04:47.749628292 +0000 UTC m=+168.027042185" watchObservedRunningTime="2026-04-16 18:04:47.750359106 +0000 UTC m=+168.027773000" Apr 16 18:04:47.884845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:47.884819 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-gssjt" Apr 16 18:04:48.291601 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:04:48.291562 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv" Apr 16 18:04:52.296710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.296674 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"] Apr 16 18:04:52.297729 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.297703 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-shshw"] Apr 16 18:04:52.299682 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.299668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.304921 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.304900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:04:52.305181 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.305165 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:04:52.305535 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.305460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:04:52.305535 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.305506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vblvg\"" Apr 16 18:04:52.305709 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.305506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:04:52.322061 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.322043 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-shshw"] Apr 16 18:04:52.349500 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.349477 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"] Apr 16 18:04:52.351393 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.351364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.368147 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.368126 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"] Apr 16 18:04:52.381484 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84abdb7e-6fd5-451b-ace1-130db696d178-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.381573 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84abdb7e-6fd5-451b-ace1-130db696d178-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.381573 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-image-registry-private-configuration\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: 
\"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381573 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84abdb7e-6fd5-451b-ace1-130db696d178-crio-socket\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.381670 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-bound-sa-token\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381670 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qf6\" (UniqueName: \"kubernetes.io/projected/84abdb7e-6fd5-451b-ace1-130db696d178-kube-api-access-54qf6\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.381670 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-trusted-ca\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381670 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381664 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b7c142-016e-4ff4-a801-dab62c30ca70-ca-trust-extracted\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381779 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-tls\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381779 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-certificates\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" Apr 16 18:04:52.381779 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84abdb7e-6fd5-451b-ace1-130db696d178-data-volume\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw" Apr 16 18:04:52.381779 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-installation-pull-secrets\") 
pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.381901 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.381828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chddj\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-kube-api-access-chddj\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482225 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chddj\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-kube-api-access-chddj\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482310 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84abdb7e-6fd5-451b-ace1-130db696d178-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482310 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84abdb7e-6fd5-451b-ace1-130db696d178-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482419 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-image-registry-private-configuration\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482463 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84abdb7e-6fd5-451b-ace1-130db696d178-crio-socket\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482506 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-bound-sa-token\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482544 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54qf6\" (UniqueName: \"kubernetes.io/projected/84abdb7e-6fd5-451b-ace1-130db696d178-kube-api-access-54qf6\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482594 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84abdb7e-6fd5-451b-ace1-130db696d178-crio-socket\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482651 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-trusted-ca\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482702 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b7c142-016e-4ff4-a801-dab62c30ca70-ca-trust-extracted\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482702 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-tls\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482800 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-certificates\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.482852 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84abdb7e-6fd5-451b-ace1-130db696d178-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.482955 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.482929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84abdb7e-6fd5-451b-ace1-130db696d178-data-volume\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.483134 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.483097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-installation-pull-secrets\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.483311 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.483269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84abdb7e-6fd5-451b-ace1-130db696d178-data-volume\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.483311 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.483100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b7c142-016e-4ff4-a801-dab62c30ca70-ca-trust-extracted\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.483794 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.483774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-trusted-ca\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.484228 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.484211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-certificates\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.485084 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.485066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-registry-tls\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.485181 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.485164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84abdb7e-6fd5-451b-ace1-130db696d178-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.485245 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.485232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-image-registry-private-configuration\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.485496 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.485478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b7c142-016e-4ff4-a801-dab62c30ca70-installation-pull-secrets\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.497947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.497924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chddj\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-kube-api-access-chddj\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.499288 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.499269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b7c142-016e-4ff4-a801-dab62c30ca70-bound-sa-token\") pod \"image-registry-5ddc6d5d76-tvmvt\" (UID: \"97b7c142-016e-4ff4-a801-dab62c30ca70\") " pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.502213 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.502191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qf6\" (UniqueName: \"kubernetes.io/projected/84abdb7e-6fd5-451b-ace1-130db696d178-kube-api-access-54qf6\") pod \"insights-runtime-extractor-shshw\" (UID: \"84abdb7e-6fd5-451b-ace1-130db696d178\") " pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.609964 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.609914 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-shshw"
Apr 16 18:04:52.659215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.659188 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:52.737692 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.737662 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-shshw"]
Apr 16 18:04:52.742161 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:52.742133 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84abdb7e_6fd5_451b_ace1_130db696d178.slice/crio-98d4c521a7ec4a7c47237e6760be52b2a65525a70a0229f6579ea8cb992676a7 WatchSource:0}: Error finding container 98d4c521a7ec4a7c47237e6760be52b2a65525a70a0229f6579ea8cb992676a7: Status 404 returned error can't find the container with id 98d4c521a7ec4a7c47237e6760be52b2a65525a70a0229f6579ea8cb992676a7
Apr 16 18:04:52.746105 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.746078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-shshw" event={"ID":"84abdb7e-6fd5-451b-ace1-130db696d178","Type":"ContainerStarted","Data":"98d4c521a7ec4a7c47237e6760be52b2a65525a70a0229f6579ea8cb992676a7"}
Apr 16 18:04:52.798199 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:52.798177 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"]
Apr 16 18:04:52.801147 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:52.801120 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b7c142_016e_4ff4_a801_dab62c30ca70.slice/crio-1c7c5f532347c757f67608fc43124a37644b317c2cf25737f0bc983c597015cb WatchSource:0}: Error finding container 1c7c5f532347c757f67608fc43124a37644b317c2cf25737f0bc983c597015cb: Status 404 returned error can't find the container with id 1c7c5f532347c757f67608fc43124a37644b317c2cf25737f0bc983c597015cb
Apr 16 18:04:53.721711 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.721684 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-btqpr"
Apr 16 18:04:53.751861 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.751815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-shshw" event={"ID":"84abdb7e-6fd5-451b-ace1-130db696d178","Type":"ContainerStarted","Data":"35594f44d1c95937cbbe4ae62dca4339bef88a299d8602751d82c2f44f2005a8"}
Apr 16 18:04:53.752017 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.751872 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-shshw" event={"ID":"84abdb7e-6fd5-451b-ace1-130db696d178","Type":"ContainerStarted","Data":"14d05b1b7ee77910f1056e1f0af4c4957625d10f1f1bf533ad9fdfde3c009fd8"}
Apr 16 18:04:53.753630 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.753601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" event={"ID":"97b7c142-016e-4ff4-a801-dab62c30ca70","Type":"ContainerStarted","Data":"954b16ab72f809d4b1b1a5a65eb4d21b9ab888c56fd5961943ea620c42337e08"}
Apr 16 18:04:53.753755 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.753637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" event={"ID":"97b7c142-016e-4ff4-a801-dab62c30ca70","Type":"ContainerStarted","Data":"1c7c5f532347c757f67608fc43124a37644b317c2cf25737f0bc983c597015cb"}
Apr 16 18:04:53.753817 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.753782 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:04:53.774820 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:53.774774 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt" podStartSLOduration=1.774756569 podStartE2EDuration="1.774756569s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:53.77333216 +0000 UTC m=+174.050746055" watchObservedRunningTime="2026-04-16 18:04:53.774756569 +0000 UTC m=+174.052170464"
Apr 16 18:04:54.757629 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:54.757599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-shshw" event={"ID":"84abdb7e-6fd5-451b-ace1-130db696d178","Type":"ContainerStarted","Data":"e0566814577eb722e343ff8f1fa853d2dab6defe8d725462e221c16ce12f7879"}
Apr 16 18:04:54.777326 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:54.777281 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-shshw" podStartSLOduration=0.90859861 podStartE2EDuration="2.777268538s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:52.791205852 +0000 UTC m=+173.068619725" lastFinishedPulling="2026-04-16 18:04:54.659875778 +0000 UTC m=+174.937289653" observedRunningTime="2026-04-16 18:04:54.776516446 +0000 UTC m=+175.053930350" watchObservedRunningTime="2026-04-16 18:04:54.777268538 +0000 UTC m=+175.054682484"
Apr 16 18:04:55.308711 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:55.308671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:55.311182 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:55.311156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e4633a3-2eb6-4ec0-af69-c94c4487082b-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-kgn8f\" (UID: \"5e4633a3-2eb6-4ec0-af69-c94c4487082b\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:55.517966 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:55.517928 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"
Apr 16 18:04:55.641209 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:55.641176 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f"]
Apr 16 18:04:55.644490 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:04:55.644459 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4633a3_2eb6_4ec0_af69_c94c4487082b.slice/crio-d051c3c5556320f03d0acd43ac91472f31f8fcfe6708dc6d3115c550087d7973 WatchSource:0}: Error finding container d051c3c5556320f03d0acd43ac91472f31f8fcfe6708dc6d3115c550087d7973: Status 404 returned error can't find the container with id d051c3c5556320f03d0acd43ac91472f31f8fcfe6708dc6d3115c550087d7973
Apr 16 18:04:55.760742 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:55.760699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" event={"ID":"5e4633a3-2eb6-4ec0-af69-c94c4487082b","Type":"ContainerStarted","Data":"d051c3c5556320f03d0acd43ac91472f31f8fcfe6708dc6d3115c550087d7973"}
Apr 16 18:04:56.764702 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:56.764676 2574 generic.go:358] "Generic (PLEG): container finished" podID="665ebed3-4568-4fa2-8182-61d33ca28db9" containerID="65f80bc0dc8fd3894595bac0c1c37d7dcabacbed740774f7c14412c258c26da0" exitCode=255
Apr 16 18:04:56.765059 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:56.764760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" event={"ID":"665ebed3-4568-4fa2-8182-61d33ca28db9","Type":"ContainerDied","Data":"65f80bc0dc8fd3894595bac0c1c37d7dcabacbed740774f7c14412c258c26da0"}
Apr 16 18:04:56.766288 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:56.766263 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" event={"ID":"5e4633a3-2eb6-4ec0-af69-c94c4487082b","Type":"ContainerStarted","Data":"ad7aaf027ad32a0b36b3fd12df0271d871f917fd6cc25aff5e075f34460898f1"}
Apr 16 18:04:56.770531 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:56.770512 2574 scope.go:117] "RemoveContainer" containerID="65f80bc0dc8fd3894595bac0c1c37d7dcabacbed740774f7c14412c258c26da0"
Apr 16 18:04:56.803289 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:56.803251 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-kgn8f" podStartSLOduration=32.756644885 podStartE2EDuration="33.803235666s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:55.646277826 +0000 UTC m=+175.923691698" lastFinishedPulling="2026-04-16 18:04:56.692868592 +0000 UTC m=+176.970282479" observedRunningTime="2026-04-16 18:04:56.802283647 +0000 UTC m=+177.079697541" watchObservedRunningTime="2026-04-16 18:04:56.803235666 +0000 UTC m=+177.080649616"
Apr 16 18:04:57.770735 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:04:57.770705 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b56bc6f89-rdgd9" event={"ID":"665ebed3-4568-4fa2-8182-61d33ca28db9","Type":"ContainerStarted","Data":"f01d2993dcaa7fa796f7ffca2d58117527870dc57a61c618c7deb65a6279ca5b"}
Apr 16 18:05:00.232593 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.232567 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"]
Apr 16 18:05:00.235183 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.235165 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.237390 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.237343 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lhln5\""
Apr 16 18:05:00.237390 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.237388 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:05:00.237547 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.237397 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 18:05:00.237708 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.237684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 18:05:00.253581 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.253561 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"]
Apr 16 18:05:00.268407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.268354 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kxqds"]
Apr 16 18:05:00.270646 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.270628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.272856 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.272564 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:05:00.272856 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.272681 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:05:00.272856 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.272771 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:05:00.272856 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.272781 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s82h7\""
Apr 16 18:05:00.342849 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.342826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-metrics-client-ca\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.342982 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.342864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-root\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.342982 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.342892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.342982 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.342956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791a4644-5779-4690-b410-ae72df65f3bd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.343121 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8f8\" (UniqueName: \"kubernetes.io/projected/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-kube-api-access-xg8f8\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343121 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.343121 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mchnv\" (UniqueName: \"kubernetes.io/projected/791a4644-5779-4690-b410-ae72df65f3bd-kube-api-access-mchnv\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.343121 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-tls\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-sys\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-accelerators-collector-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-wtmp\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-textfile\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.343276 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.343257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-metrics-client-ca\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-root\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791a4644-5779-4690-b410-ae72df65f3bd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8f8\" (UniqueName: \"kubernetes.io/projected/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-kube-api-access-xg8f8\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-root\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mchnv\" (UniqueName: \"kubernetes.io/projected/791a4644-5779-4690-b410-ae72df65f3bd-kube-api-access-mchnv\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-tls\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-sys\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.444710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-accelerators-collector-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-wtmp\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-textfile\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-sys\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.444997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-wtmp\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445241 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.445071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-metrics-client-ca\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.445360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-textfile\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445564 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.445513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-accelerators-collector-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.445859 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.445837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791a4644-5779-4690-b410-ae72df65f3bd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.447125 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.447104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.447464 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.447432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.447612 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.447598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-node-exporter-tls\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.447654 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.447611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/791a4644-5779-4690-b410-ae72df65f3bd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"
Apr 16 18:05:00.452461 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.452439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8f8\" (UniqueName: \"kubernetes.io/projected/f65b5dc6-1f16-4b6a-8748-9fe5615c9e68-kube-api-access-xg8f8\") pod \"node-exporter-kxqds\" (UID: \"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68\") " pod="openshift-monitoring/node-exporter-kxqds"
Apr 16 18:05:00.452662 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.452647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mchnv\" (UniqueName:
\"kubernetes.io/projected/791a4644-5779-4690-b410-ae72df65f3bd-kube-api-access-mchnv\") pod \"openshift-state-metrics-5669946b84-bh2pn\" (UID: \"791a4644-5779-4690-b410-ae72df65f3bd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" Apr 16 18:05:00.545362 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.545306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" Apr 16 18:05:00.581139 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.581109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kxqds" Apr 16 18:05:00.596108 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:05:00.596060 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65b5dc6_1f16_4b6a_8748_9fe5615c9e68.slice/crio-514c45ad87afba2c3f004b6fa7b808ca147819c75b14d4c8ff47b660b7ec6e32 WatchSource:0}: Error finding container 514c45ad87afba2c3f004b6fa7b808ca147819c75b14d4c8ff47b660b7ec6e32: Status 404 returned error can't find the container with id 514c45ad87afba2c3f004b6fa7b808ca147819c75b14d4c8ff47b660b7ec6e32 Apr 16 18:05:00.676596 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.676552 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn"] Apr 16 18:05:00.679695 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:05:00.679671 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791a4644_5779_4690_b410_ae72df65f3bd.slice/crio-7923377a76621af3d26b44f6bd51c7c29f6f10ad4c6db0a2796963c6e86cb080 WatchSource:0}: Error finding container 7923377a76621af3d26b44f6bd51c7c29f6f10ad4c6db0a2796963c6e86cb080: Status 404 returned error can't find the container with id 
7923377a76621af3d26b44f6bd51c7c29f6f10ad4c6db0a2796963c6e86cb080 Apr 16 18:05:00.780565 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.780542 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kxqds" event={"ID":"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68","Type":"ContainerStarted","Data":"514c45ad87afba2c3f004b6fa7b808ca147819c75b14d4c8ff47b660b7ec6e32"} Apr 16 18:05:00.781910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.781889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" event={"ID":"791a4644-5779-4690-b410-ae72df65f3bd","Type":"ContainerStarted","Data":"5c853d900155081beb73c800055071fee4c1ede4c897f48b59e1b99922438780"} Apr 16 18:05:00.781997 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:00.781918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" event={"ID":"791a4644-5779-4690-b410-ae72df65f3bd","Type":"ContainerStarted","Data":"7923377a76621af3d26b44f6bd51c7c29f6f10ad4c6db0a2796963c6e86cb080"} Apr 16 18:05:01.371393 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.371352 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:01.374538 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.374359 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.376461 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.376435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:05:01.377433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.376910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:05:01.377433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.377084 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:05:01.377433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.377406 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:05:01.377715 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.377697 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-sgb56\"" Apr 16 18:05:01.378716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.377969 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:05:01.378716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.378174 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:05:01.378716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.378190 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:05:01.378716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.378315 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:05:01.378716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.378531 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:05:01.391479 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.391401 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:01.455412 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455516 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455516 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455629 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnxw5\" (UniqueName: 
\"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455629 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455629 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455758 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455758 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455835 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455835 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455927 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.455927 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.456023 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.455929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556669 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556669 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556832 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556832 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556832 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556772 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lnxw5\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556832 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.556974 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:05:01.556951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.557122 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.556978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.557122 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.557015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.557122 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.557059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.557122 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.557058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.558215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.557583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.558215 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.558170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.558459 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:05:01.558439 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:05:01.558531 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:05:01.558518 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls podName:389d15fc-b614-4159-bec8-16bc0961e897 nodeName:}" failed. No retries permitted until 2026-04-16 18:05:02.058498353 +0000 UTC m=+182.335912229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "389d15fc-b614-4159-bec8-16bc0961e897") : secret "alertmanager-main-tls" not found Apr 16 18:05:01.562021 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.561954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.562171 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.562119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.562793 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.562718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.563212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.563162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.564050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.564029 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.564464 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.564420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.564758 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.564729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.564839 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.564790 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.569275 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.569239 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnxw5\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:01.786564 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:05:01.786538 2574 generic.go:358] "Generic (PLEG): container finished" podID="f65b5dc6-1f16-4b6a-8748-9fe5615c9e68" containerID="dd530799ae80c9f45e246628ee7717b718b6567e40e6a5a959ae8af10be8fb0a" exitCode=0 Apr 16 18:05:01.786668 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.786645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kxqds" event={"ID":"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68","Type":"ContainerDied","Data":"dd530799ae80c9f45e246628ee7717b718b6567e40e6a5a959ae8af10be8fb0a"} Apr 16 18:05:01.788873 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:01.788843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" event={"ID":"791a4644-5779-4690-b410-ae72df65f3bd","Type":"ContainerStarted","Data":"f86b6c27a159f51469d3f14b6e089c85f72041806876e2f10510b5c3ff6aa10b"} Apr 16 18:05:02.061931 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.061899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:02.064238 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.064209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:02.286760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.286738 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:02.302637 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.302614 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" Apr 16 18:05:02.427808 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.427768 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:02.430148 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:05:02.430117 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389d15fc_b614_4159_bec8_16bc0961e897.slice/crio-7d048b59318ce3d93d06dc34378ca7acf7a372182fe267d4f53e34ae2ad083f7 WatchSource:0}: Error finding container 7d048b59318ce3d93d06dc34378ca7acf7a372182fe267d4f53e34ae2ad083f7: Status 404 returned error can't find the container with id 7d048b59318ce3d93d06dc34378ca7acf7a372182fe267d4f53e34ae2ad083f7 Apr 16 18:05:02.792902 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.792870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"7d048b59318ce3d93d06dc34378ca7acf7a372182fe267d4f53e34ae2ad083f7"} Apr 16 18:05:02.794861 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.794834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kxqds" event={"ID":"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68","Type":"ContainerStarted","Data":"9f33e9020aaf3b2bf77a927de25542143d4156bfb271d8b7a14ac4364e12c749"} Apr 16 18:05:02.794981 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.794866 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kxqds" 
event={"ID":"f65b5dc6-1f16-4b6a-8748-9fe5615c9e68","Type":"ContainerStarted","Data":"0e4c2f57feedad9a557d0eb7e006aa113b9b99c1eedb5f523423b4ff51e4a2b9"} Apr 16 18:05:02.796513 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.796489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" event={"ID":"791a4644-5779-4690-b410-ae72df65f3bd","Type":"ContainerStarted","Data":"9b84abdef57784941f2d55ede94838bcaf8ab7b14a86924ccdc93c9d6ed63e54"} Apr 16 18:05:02.811636 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.811597 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kxqds" podStartSLOduration=2.160228342 podStartE2EDuration="2.811583282s" podCreationTimestamp="2026-04-16 18:05:00 +0000 UTC" firstStartedPulling="2026-04-16 18:05:00.595259744 +0000 UTC m=+180.872673622" lastFinishedPulling="2026-04-16 18:05:01.246614676 +0000 UTC m=+181.524028562" observedRunningTime="2026-04-16 18:05:02.810697696 +0000 UTC m=+183.088111589" watchObservedRunningTime="2026-04-16 18:05:02.811583282 +0000 UTC m=+183.088997176" Apr 16 18:05:02.829152 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:02.829113 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bh2pn" podStartSLOduration=1.858154418 podStartE2EDuration="2.829100898s" podCreationTimestamp="2026-04-16 18:05:00 +0000 UTC" firstStartedPulling="2026-04-16 18:05:00.798839816 +0000 UTC m=+181.076253696" lastFinishedPulling="2026-04-16 18:05:01.769786286 +0000 UTC m=+182.047200176" observedRunningTime="2026-04-16 18:05:02.827658152 +0000 UTC m=+183.105072048" watchObservedRunningTime="2026-04-16 18:05:02.829100898 +0000 UTC m=+183.106514791" Apr 16 18:05:03.358900 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.358873 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-566b7d494-ppmb4"] Apr 16 18:05:03.379616 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.379596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-566b7d494-ppmb4"] Apr 16 18:05:03.379730 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.379718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.381989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.381673 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:05:03.381989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.381843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7nf6rbm5nismi\"" Apr 16 18:05:03.381989 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.381897 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:05:03.382203 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.382018 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:05:03.382203 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.382079 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-k89qx\"" Apr 16 18:05:03.382330 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.382310 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:05:03.382407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.382345 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:05:03.475855 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.475829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.475875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.475915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.475937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " 
pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.475964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-grpc-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.476021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fq4m\" (UniqueName: \"kubernetes.io/projected/3871267c-4368-4f90-9ba9-a4d4b8345f94-kube-api-access-4fq4m\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.476105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.476141 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.476133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3871267c-4368-4f90-9ba9-a4d4b8345f94-metrics-client-ca\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.576737 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576714 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.576829 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.576893 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.576942 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.576942 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-grpc-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.577059 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.576943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fq4m\" (UniqueName: \"kubernetes.io/projected/3871267c-4368-4f90-9ba9-a4d4b8345f94-kube-api-access-4fq4m\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.577454 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.577102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.577454 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.577153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3871267c-4368-4f90-9ba9-a4d4b8345f94-metrics-client-ca\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.577950 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.577923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3871267c-4368-4f90-9ba9-a4d4b8345f94-metrics-client-ca\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.579317 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:05:03.579291 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.579770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.579720 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.579770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.579766 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.579926 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.579894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.580003 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.579982 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" 
(UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-grpc-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.580150 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.580126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3871267c-4368-4f90-9ba9-a4d4b8345f94-secret-thanos-querier-tls\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.584182 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.584162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fq4m\" (UniqueName: \"kubernetes.io/projected/3871267c-4368-4f90-9ba9-a4d4b8345f94-kube-api-access-4fq4m\") pod \"thanos-querier-566b7d494-ppmb4\" (UID: \"3871267c-4368-4f90-9ba9-a4d4b8345f94\") " pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.705203 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.705182 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:03.801816 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.801783 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a" exitCode=0 Apr 16 18:05:03.801938 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.801829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a"} Apr 16 18:05:03.827400 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:03.827357 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-566b7d494-ppmb4"] Apr 16 18:05:03.830345 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:05:03.830323 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3871267c_4368_4f90_9ba9_a4d4b8345f94.slice/crio-7bb26e357f5843e01cd2bc19ec231eba1ac287a1bcfb47f1b8ba8062060903a3 WatchSource:0}: Error finding container 7bb26e357f5843e01cd2bc19ec231eba1ac287a1bcfb47f1b8ba8062060903a3: Status 404 returned error can't find the container with id 7bb26e357f5843e01cd2bc19ec231eba1ac287a1bcfb47f1b8ba8062060903a3 Apr 16 18:05:04.592240 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.592209 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-fc9cdddd7-svpwj"] Apr 16 18:05:04.594799 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.594771 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.596942 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.596923 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:05:04.597445 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.597424 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2urnfgo8r22lm\"" Apr 16 18:05:04.597559 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.597517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:05:04.597631 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.597556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:05:04.597726 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.597626 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-pwhct\"" Apr 16 18:05:04.597726 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.597690 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:05:04.607274 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.607253 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fc9cdddd7-svpwj"] Apr 16 18:05:04.686236 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-metrics-server-audit-profiles\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " 
pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686441 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686441 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44d651ae-7188-4e4e-af07-3492f11dc279-audit-log\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686441 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-tls\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686441 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qpr\" (UniqueName: \"kubernetes.io/projected/44d651ae-7188-4e4e-af07-3492f11dc279-kube-api-access-c6qpr\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686687 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686460 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-client-certs\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.686687 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.686549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-client-ca-bundle\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787607 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-client-ca-bundle\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787749 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-metrics-server-audit-profiles\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787749 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-configmap-kubelet-serving-ca-bundle\") 
pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787749 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44d651ae-7188-4e4e-af07-3492f11dc279-audit-log\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-tls\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qpr\" (UniqueName: \"kubernetes.io/projected/44d651ae-7188-4e4e-af07-3492f11dc279-kube-api-access-c6qpr\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.787910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.787835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-client-certs\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.788814 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.788684 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44d651ae-7188-4e4e-af07-3492f11dc279-audit-log\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.788814 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.788772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.788968 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.788881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44d651ae-7188-4e4e-af07-3492f11dc279-metrics-server-audit-profiles\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.791062 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.790962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-client-ca-bundle\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.791164 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.791105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-tls\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: 
\"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.791164 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.791105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44d651ae-7188-4e4e-af07-3492f11dc279-secret-metrics-server-client-certs\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.797184 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.797165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qpr\" (UniqueName: \"kubernetes.io/projected/44d651ae-7188-4e4e-af07-3492f11dc279-kube-api-access-c6qpr\") pod \"metrics-server-fc9cdddd7-svpwj\" (UID: \"44d651ae-7188-4e4e-af07-3492f11dc279\") " pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:04.807130 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.807100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"7bb26e357f5843e01cd2bc19ec231eba1ac287a1bcfb47f1b8ba8062060903a3"} Apr 16 18:05:04.909585 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:04.909476 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" Apr 16 18:05:05.340574 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:05.340284 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fc9cdddd7-svpwj"] Apr 16 18:05:05.811870 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:05.811830 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" event={"ID":"44d651ae-7188-4e4e-af07-3492f11dc279","Type":"ContainerStarted","Data":"8799aa80de4009c98cbe65752f06b0da4f9ad8a890c49975093ef2e1c2b6c27d"} Apr 16 18:05:05.814020 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:05.813996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f"} Apr 16 18:05:05.814129 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:05.814024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92"} Apr 16 18:05:06.821274 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.821241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e"} Apr 16 18:05:06.821631 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.821282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d"} Apr 16 18:05:06.821631 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.821295 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955"} Apr 16 18:05:06.824968 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.824940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"66f315651dbd95bc791486567abf92885e95a4809a4c0a0a44fbd70720f5f17b"} Apr 16 18:05:06.825083 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.824978 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"143e699b8fe4186b87c16350e2c8f66acdf1d25203323bd12f2e71c82140474d"} Apr 16 18:05:06.825083 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:06.825001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"4f2985868593738c2735efd5e0fcddd907295039ca0818ba094727070874c619"} Apr 16 18:05:07.829933 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.829898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" event={"ID":"44d651ae-7188-4e4e-af07-3492f11dc279","Type":"ContainerStarted","Data":"9a943eb7d1dab0ff2d7e3599993fb2d856e3744901b21a5ba21628e28f8e2500"} Apr 16 18:05:07.832687 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.832662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerStarted","Data":"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d"} Apr 16 18:05:07.834868 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.834850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"1f72b668516ba07e62f2346d1b2a85a39dc6a4431e8182e227de4997d4118331"} Apr 16 18:05:07.834955 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.834872 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"e07890844ad4b17e050060f489bf7bf000129b2d1d4753044b003d4dfff1acab"} Apr 16 18:05:07.834955 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.834882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" event={"ID":"3871267c-4368-4f90-9ba9-a4d4b8345f94","Type":"ContainerStarted","Data":"945e3280653aa836eaac40eb50ad8dd90930abed42adb0b687d319ae98af2400"} Apr 16 18:05:07.835050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.835033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" Apr 16 18:05:07.860193 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.860141 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj" podStartSLOduration=1.959929889 podStartE2EDuration="3.860125807s" podCreationTimestamp="2026-04-16 18:05:04 +0000 UTC" firstStartedPulling="2026-04-16 18:05:05.360168734 +0000 UTC m=+185.637582617" lastFinishedPulling="2026-04-16 18:05:07.260364663 +0000 UTC m=+187.537778535" observedRunningTime="2026-04-16 18:05:07.858981305 +0000 UTC m=+188.136395198" 
watchObservedRunningTime="2026-04-16 18:05:07.860125807 +0000 UTC m=+188.137539702"
Apr 16 18:05:07.890574 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.890534 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4" podStartSLOduration=1.942677376 podStartE2EDuration="4.890521617s" podCreationTimestamp="2026-04-16 18:05:03 +0000 UTC" firstStartedPulling="2026-04-16 18:05:03.831983294 +0000 UTC m=+184.109397167" lastFinishedPulling="2026-04-16 18:05:06.779827531 +0000 UTC m=+187.057241408" observedRunningTime="2026-04-16 18:05:07.889316193 +0000 UTC m=+188.166730086" watchObservedRunningTime="2026-04-16 18:05:07.890521617 +0000 UTC m=+188.167935560"
Apr 16 18:05:07.937328 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:07.937269 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.59185183 podStartE2EDuration="6.937225888s" podCreationTimestamp="2026-04-16 18:05:01 +0000 UTC" firstStartedPulling="2026-04-16 18:05:02.432491357 +0000 UTC m=+182.709905232" lastFinishedPulling="2026-04-16 18:05:06.777865418 +0000 UTC m=+187.055279290" observedRunningTime="2026-04-16 18:05:07.935935137 +0000 UTC m=+188.213349035" watchObservedRunningTime="2026-04-16 18:05:07.937225888 +0000 UTC m=+188.214639785"
Apr 16 18:05:13.845753 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:13.845712 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-566b7d494-ppmb4"
Apr 16 18:05:14.762710 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:14.762670 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5ddc6d5d76-tvmvt"
Apr 16 18:05:17.316836 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.316799 2574 kuberuntime_container.go:864] "Killing container with a grace period"
pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" podUID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" containerName="registry" containerID="cri-o://6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a" gracePeriod=30
Apr 16 18:05:17.555703 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.555683 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:05:17.690015 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.689984 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690034 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690057 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6xkm\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690095 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID:
\"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690116 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690421 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690206 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690421 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690249 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690421 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690289 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates\") pod \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\" (UID: \"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290\") "
Apr 16 18:05:17.690619 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690594 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "trusted-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:05:17.690797 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690779 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-trusted-ca\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.690963 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.690939 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:05:17.692585 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.692554 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:17.692709 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.692688 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm" (OuterVolumeSpecName: "kube-api-access-b6xkm") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "kube-api-access-b6xkm".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:05:17.692763 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.692695 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:17.692763 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.692708 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:05:17.693656 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.693631 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:05:17.698934 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.698909 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" (UID: "10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:05:17.791275 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791250 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-bound-sa-token\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.791275 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791273 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-ca-trust-extracted\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.791432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791288 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.791432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791300 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-registry-certificates\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.791432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791315 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-image-registry-private-configuration\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.791432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791329 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-installation-pull-secrets\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr
16 18:05:17.791432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.791343 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6xkm\" (UniqueName: \"kubernetes.io/projected/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290-kube-api-access-b6xkm\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:05:17.869845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.865320 2574 generic.go:358] "Generic (PLEG): container finished" podID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" containerID="6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a" exitCode=0
Apr 16 18:05:17.869845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.865489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" event={"ID":"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290","Type":"ContainerDied","Data":"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"}
Apr 16 18:05:17.869845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.865524 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w" event={"ID":"10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290","Type":"ContainerDied","Data":"5caeb20b9503bdee4368d320559074ae8b34ede63db04efe221c8876e9f7dfde"}
Apr 16 18:05:17.869845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.865558 2574 scope.go:117] "RemoveContainer" containerID="6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"
Apr 16 18:05:17.869845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.865754 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-54c8fc9545-l6f7w"
Apr 16 18:05:17.878522 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.878507 2574 scope.go:117] "RemoveContainer" containerID="6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"
Apr 16 18:05:17.878771 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:05:17.878753 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a\": container with ID starting with 6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a not found: ID does not exist" containerID="6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"
Apr 16 18:05:17.878840 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.878781 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a"} err="failed to get container status \"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a\": rpc error: code = NotFound desc = could not find container \"6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a\": container with ID starting with 6d0d7072ad6e1d6e9e27b3234fe46447cb4849e4c8b46466dc656c4b53d70c3a not found: ID does not exist"
Apr 16 18:05:17.890917 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.890896 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"]
Apr 16 18:05:17.894308 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:17.894291 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-54c8fc9545-l6f7w"]
Apr 16 18:05:18.294563 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:18.294536 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290"
path="/var/lib/kubelet/pods/10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290/volumes"
Apr 16 18:05:24.909612 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:24.909569 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj"
Apr 16 18:05:24.909612 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:24.909624 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj"
Apr 16 18:05:36.922951 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:36.922917 2574 generic.go:358] "Generic (PLEG): container finished" podID="35a6661a-e671-4470-a717-3ca52c801a37" containerID="4f7207e8b043fbc9d0835d4e3c32e210aaacf123479c2177a52f8f182d1b4137" exitCode=0
Apr 16 18:05:36.923293 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:36.922990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" event={"ID":"35a6661a-e671-4470-a717-3ca52c801a37","Type":"ContainerDied","Data":"4f7207e8b043fbc9d0835d4e3c32e210aaacf123479c2177a52f8f182d1b4137"}
Apr 16 18:05:36.923336 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:36.923292 2574 scope.go:117] "RemoveContainer" containerID="4f7207e8b043fbc9d0835d4e3c32e210aaacf123479c2177a52f8f182d1b4137"
Apr 16 18:05:37.927066 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:37.927034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-rwh86" event={"ID":"35a6661a-e671-4470-a717-3ca52c801a37","Type":"ContainerStarted","Data":"d25e06d309dce4add1c177c11f69956c0d7df3bbbd612ea8d16d227e99eaabe9"}
Apr 16 18:05:44.916047 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:44.916010 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj"
Apr 16 18:05:44.920112 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:05:44.920084 2574
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-fc9cdddd7-svpwj"
Apr 16 18:06:11.992420 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:11.992356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:06:11.994718 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:11.994690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d69a3dd-c0fd-4764-b3d8-802189b16640-metrics-certs\") pod \"network-metrics-daemon-24pqv\" (UID: \"9d69a3dd-c0fd-4764-b3d8-802189b16640\") " pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:06:12.295066 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:12.294982 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\""
Apr 16 18:06:12.302499 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:12.302478 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pqv"
Apr 16 18:06:12.416090 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:12.416060 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-24pqv"]
Apr 16 18:06:12.419239 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:06:12.419209 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d69a3dd_c0fd_4764_b3d8_802189b16640.slice/crio-a3cd030b9ae4207e116a90a88bd00ddb2523c7af255515b1a4a8b5274ac9da04 WatchSource:0}: Error finding container a3cd030b9ae4207e116a90a88bd00ddb2523c7af255515b1a4a8b5274ac9da04: Status 404 returned error can't find the container with id a3cd030b9ae4207e116a90a88bd00ddb2523c7af255515b1a4a8b5274ac9da04
Apr 16 18:06:13.034900 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:13.034856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pqv" event={"ID":"9d69a3dd-c0fd-4764-b3d8-802189b16640","Type":"ContainerStarted","Data":"a3cd030b9ae4207e116a90a88bd00ddb2523c7af255515b1a4a8b5274ac9da04"}
Apr 16 18:06:14.039846 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:14.039810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pqv" event={"ID":"9d69a3dd-c0fd-4764-b3d8-802189b16640","Type":"ContainerStarted","Data":"db81e10bf579424b2d66d25be07ad7314e018c7547ce6ad113f2479f909a085f"}
Apr 16 18:06:14.040195 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:14.039852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pqv" event={"ID":"9d69a3dd-c0fd-4764-b3d8-802189b16640","Type":"ContainerStarted","Data":"3e459e34f3063c31e0f24610e0413f022000463a9e5a988225efb2a5fe31f779"}
Apr 16 18:06:14.058905 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:14.058859 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-multus/network-metrics-daemon-24pqv" podStartSLOduration=252.981121953 podStartE2EDuration="4m14.058844192s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:06:12.421071732 +0000 UTC m=+252.698485604" lastFinishedPulling="2026-04-16 18:06:13.498793971 +0000 UTC m=+253.776207843" observedRunningTime="2026-04-16 18:06:14.057590831 +0000 UTC m=+254.335004723" watchObservedRunningTime="2026-04-16 18:06:14.058844192 +0000 UTC m=+254.336258085"
Apr 16 18:06:20.589916 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.589835 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:06:20.590361 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590295 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="alertmanager" containerID="cri-o://99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" gracePeriod=120
Apr 16 18:06:20.590458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590355 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-metric" containerID="cri-o://66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" gracePeriod=120
Apr 16 18:06:20.590458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590388 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="prom-label-proxy" containerID="cri-o://ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" gracePeriod=120
Apr 16 18:06:20.590458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590428 2574 kuberuntime_container.go:864] "Killing container with a grace period"
pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="config-reloader" containerID="cri-o://23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" gracePeriod=120
Apr 16 18:06:20.590458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590431 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-web" containerID="cri-o://47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" gracePeriod=120
Apr 16 18:06:20.590644 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:20.590495 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy" containerID="cri-o://616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" gracePeriod=120
Apr 16 18:06:21.066325 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066295 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" exitCode=0
Apr 16 18:06:21.066325 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066321 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" exitCode=0
Apr 16 18:06:21.066325 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066328 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" exitCode=0
Apr 16 18:06:21.066545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066334 2574 generic.go:358] "Generic (PLEG): container finished"
podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" exitCode=0
Apr 16 18:06:21.066545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d"}
Apr 16 18:06:21.066545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d"}
Apr 16 18:06:21.066545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f"}
Apr 16 18:06:21.066545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.066433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92"}
Apr 16 18:06:21.834969 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.834948 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:21.959999 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.959978 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnxw5\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960153 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960032 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960153 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960058 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960153 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960094 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960153 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960113 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID:
\"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960153 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960143 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960172 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960202 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960235 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960268 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960337 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960406 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.960433 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960431 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca\") pod \"389d15fc-b614-4159-bec8-16bc0961e897\" (UID: \"389d15fc-b614-4159-bec8-16bc0961e897\") "
Apr 16 18:06:21.961312 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.960983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:06:21.961630 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.961575 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897").
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:06:21.961721 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.961696 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:06:21.963229 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.963196 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume" (OuterVolumeSpecName: "config-volume") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:06:21.963888 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.963772 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:06:21.963888 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.963817 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out" (OuterVolumeSpecName: "config-out") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "config-out".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:21.963888 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.963854 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5" (OuterVolumeSpecName: "kube-api-access-lnxw5") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "kube-api-access-lnxw5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:21.964302 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.964281 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:21.964406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.964313 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:21.964548 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.964529 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). 
InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:21.965140 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.965120 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:21.967917 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.967893 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:21.975023 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:21.975004 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config" (OuterVolumeSpecName: "web-config") pod "389d15fc-b614-4159-bec8-16bc0961e897" (UID: "389d15fc-b614-4159-bec8-16bc0961e897"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:22.061865 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061844 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061865 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061866 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-config-volume\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061877 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-config-out\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061886 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-web-config\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061895 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-main-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061904 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-cluster-tls-config\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061913 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061922 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-main-db\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061932 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061940 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-tls-assets\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061949 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/389d15fc-b614-4159-bec8-16bc0961e897-metrics-client-ca\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.061957 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lnxw5\" (UniqueName: \"kubernetes.io/projected/389d15fc-b614-4159-bec8-16bc0961e897-kube-api-access-lnxw5\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.061976 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:06:22.061966 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/389d15fc-b614-4159-bec8-16bc0961e897-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.071885 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071862 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" exitCode=0 Apr 16 18:06:22.071885 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071883 2574 generic.go:358] "Generic (PLEG): container finished" podID="389d15fc-b614-4159-bec8-16bc0961e897" containerID="47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" exitCode=0 Apr 16 18:06:22.072001 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e"} Apr 16 18:06:22.072001 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071961 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:22.072001 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071981 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955"} Apr 16 18:06:22.072001 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.071994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"389d15fc-b614-4159-bec8-16bc0961e897","Type":"ContainerDied","Data":"7d048b59318ce3d93d06dc34378ca7acf7a372182fe267d4f53e34ae2ad083f7"} Apr 16 18:06:22.072167 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.072008 2574 scope.go:117] "RemoveContainer" containerID="ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" Apr 16 18:06:22.079674 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.079538 2574 scope.go:117] "RemoveContainer" containerID="66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" Apr 16 18:06:22.086497 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.086480 2574 scope.go:117] "RemoveContainer" containerID="616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" Apr 16 18:06:22.092481 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.092467 2574 scope.go:117] "RemoveContainer" containerID="47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" Apr 16 18:06:22.098435 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.098412 2574 scope.go:117] "RemoveContainer" containerID="23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" Apr 16 18:06:22.102722 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.102702 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:22.104693 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.104673 2574 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:22.108195 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.108179 2574 scope.go:117] "RemoveContainer" containerID="99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" Apr 16 18:06:22.114414 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.114397 2574 scope.go:117] "RemoveContainer" containerID="e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a" Apr 16 18:06:22.120356 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.120342 2574 scope.go:117] "RemoveContainer" containerID="ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" Apr 16 18:06:22.120609 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.120591 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d\": container with ID starting with ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d not found: ID does not exist" containerID="ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" Apr 16 18:06:22.120697 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.120619 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d"} err="failed to get container status \"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d\": rpc error: code = NotFound desc = could not find container \"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d\": container with ID starting with ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d not found: ID does not exist" Apr 16 18:06:22.120697 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.120642 2574 scope.go:117] "RemoveContainer" containerID="66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" Apr 16 
18:06:22.120878 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.120861 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e\": container with ID starting with 66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e not found: ID does not exist" containerID="66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" Apr 16 18:06:22.120919 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.120884 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e"} err="failed to get container status \"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e\": rpc error: code = NotFound desc = could not find container \"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e\": container with ID starting with 66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e not found: ID does not exist" Apr 16 18:06:22.120919 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.120899 2574 scope.go:117] "RemoveContainer" containerID="616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" Apr 16 18:06:22.121117 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.121102 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d\": container with ID starting with 616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d not found: ID does not exist" containerID="616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" Apr 16 18:06:22.121160 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121121 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d"} err="failed to get container status \"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d\": rpc error: code = NotFound desc = could not find container \"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d\": container with ID starting with 616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d not found: ID does not exist" Apr 16 18:06:22.121160 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121136 2574 scope.go:117] "RemoveContainer" containerID="47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" Apr 16 18:06:22.121335 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.121315 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955\": container with ID starting with 47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955 not found: ID does not exist" containerID="47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" Apr 16 18:06:22.121406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121343 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955"} err="failed to get container status \"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955\": rpc error: code = NotFound desc = could not find container \"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955\": container with ID starting with 47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955 not found: ID does not exist" Apr 16 18:06:22.121406 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121359 2574 scope.go:117] "RemoveContainer" containerID="23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" Apr 16 18:06:22.121826 ip-10-0-138-15 
kubenswrapper[2574]: E0416 18:06:22.121808 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f\": container with ID starting with 23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f not found: ID does not exist" containerID="23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" Apr 16 18:06:22.121912 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121827 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f"} err="failed to get container status \"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f\": rpc error: code = NotFound desc = could not find container \"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f\": container with ID starting with 23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f not found: ID does not exist" Apr 16 18:06:22.121912 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.121841 2574 scope.go:117] "RemoveContainer" containerID="99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" Apr 16 18:06:22.122027 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.122013 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92\": container with ID starting with 99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92 not found: ID does not exist" containerID="99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" Apr 16 18:06:22.122066 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122029 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92"} err="failed to 
get container status \"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92\": rpc error: code = NotFound desc = could not find container \"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92\": container with ID starting with 99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92 not found: ID does not exist" Apr 16 18:06:22.122066 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122043 2574 scope.go:117] "RemoveContainer" containerID="e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a" Apr 16 18:06:22.122271 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:06:22.122255 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a\": container with ID starting with e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a not found: ID does not exist" containerID="e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a" Apr 16 18:06:22.122323 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122275 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a"} err="failed to get container status \"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a\": rpc error: code = NotFound desc = could not find container \"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a\": container with ID starting with e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a not found: ID does not exist" Apr 16 18:06:22.122323 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122297 2574 scope.go:117] "RemoveContainer" containerID="ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d" Apr 16 18:06:22.122507 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122492 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d"} err="failed to get container status \"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d\": rpc error: code = NotFound desc = could not find container \"ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d\": container with ID starting with ce1e3cc4d6da0734ce0de13bbc77c026194fc386e09a35cad504f0c6759d037d not found: ID does not exist" Apr 16 18:06:22.122555 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122508 2574 scope.go:117] "RemoveContainer" containerID="66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e" Apr 16 18:06:22.122709 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122689 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e"} err="failed to get container status \"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e\": rpc error: code = NotFound desc = could not find container \"66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e\": container with ID starting with 66a504d808d7047dcf270712ecffa0d6cc638312ad2f1de01304d1a35d25b39e not found: ID does not exist" Apr 16 18:06:22.122709 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122708 2574 scope.go:117] "RemoveContainer" containerID="616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d" Apr 16 18:06:22.122961 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122941 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d"} err="failed to get container status \"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d\": rpc error: code = NotFound desc = could not find container \"616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d\": container with ID starting with 
616772b99d44b466802ec0d9d27a767ead13c059a765ff285331cbd7b698c97d not found: ID does not exist" Apr 16 18:06:22.123008 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.122962 2574 scope.go:117] "RemoveContainer" containerID="47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955" Apr 16 18:06:22.123158 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123142 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955"} err="failed to get container status \"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955\": rpc error: code = NotFound desc = could not find container \"47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955\": container with ID starting with 47f5497e321f3d6869bc35ecc13439a51535f0c2621b5b2aa153fda0a8890955 not found: ID does not exist" Apr 16 18:06:22.123158 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123158 2574 scope.go:117] "RemoveContainer" containerID="23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f" Apr 16 18:06:22.123380 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123349 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f"} err="failed to get container status \"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f\": rpc error: code = NotFound desc = could not find container \"23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f\": container with ID starting with 23a451c9c5ef6ced7b9dc7efd3f58b3c4ba162ad9ad6c3a9e6d2f686510b794f not found: ID does not exist" Apr 16 18:06:22.123457 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123390 2574 scope.go:117] "RemoveContainer" containerID="99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92" Apr 16 18:06:22.123600 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123584 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92"} err="failed to get container status \"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92\": rpc error: code = NotFound desc = could not find container \"99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92\": container with ID starting with 99af27567cedf7dbade602848ff81892c901b189c746a6cf39dcaf022b6aff92 not found: ID does not exist" Apr 16 18:06:22.123643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123601 2574 scope.go:117] "RemoveContainer" containerID="e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a" Apr 16 18:06:22.123780 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.123765 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a"} err="failed to get container status \"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a\": rpc error: code = NotFound desc = could not find container \"e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a\": container with ID starting with e600c1290b0541b6ccddfb2f2984e3bc0c577e520abcbc91b8be3e5a4cd3f32a not found: ID does not exist" Apr 16 18:06:22.136166 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136144 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:22.136482 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136468 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-web" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136484 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-web" Apr 16 18:06:22.136536 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136494 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="prom-label-proxy" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136499 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="prom-label-proxy" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136512 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" containerName="registry" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136518 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" containerName="registry" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136526 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="init-config-reloader" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136532 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="init-config-reloader" Apr 16 18:06:22.136536 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136538 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="config-reloader" Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136543 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="config-reloader" Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136550 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy" 
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136556 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136563 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="alertmanager"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136567 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="alertmanager"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136575 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-metric"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136579 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-metric"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136625 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-metric"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136634 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="alertmanager"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136640 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="prom-label-proxy"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136647 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="10c9c7b3-7baa-4a5f-9b87-9e76bbd6a290" containerName="registry"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136655 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy-web"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136662 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="config-reloader"
Apr 16 18:06:22.136760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.136667 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="389d15fc-b614-4159-bec8-16bc0961e897" containerName="kube-rbac-proxy"
Apr 16 18:06:22.141065 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.141050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.144217 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.144199 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:06:22.144296 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.144201 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:06:22.144670 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.144654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:06:22.144883 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.144867 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:06:22.144957 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.144870 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:06:22.145211 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.145195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:06:22.145770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.145752 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-sgb56\""
Apr 16 18:06:22.145962 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.145949 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:06:22.146024 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.145953 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:06:22.151181 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.151162 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:06:22.153295 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.153275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:06:22.263291 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263291 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqqh\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-kube-api-access-shqqh\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263291 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-volume\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263453 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263453 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263453 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263453 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263570 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263570 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263570 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263570 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-web-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263570 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.263709 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.263580 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-out\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.294974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.294951 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389d15fc-b614-4159-bec8-16bc0961e897" path="/var/lib/kubelet/pods/389d15fc-b614-4159-bec8-16bc0961e897/volumes"
Apr 16 18:06:22.364414 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-volume\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364491 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364491 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364491 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364491 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-web-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364909 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-out\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364909 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.364909 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.364770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shqqh\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-kube-api-access-shqqh\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.365848 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.365512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.365848 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.365692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a81f23d-53f8-4572-bd35-b6bda49d0423-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-volume\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367257 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367471 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367535 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367589 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367644 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.367933 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-web-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.368025 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.367937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a81f23d-53f8-4572-bd35-b6bda49d0423-config-out\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.368150 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.368132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.369062 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.369044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7a81f23d-53f8-4572-bd35-b6bda49d0423-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.373114 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.373096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqqh\" (UniqueName: \"kubernetes.io/projected/7a81f23d-53f8-4572-bd35-b6bda49d0423-kube-api-access-shqqh\") pod \"alertmanager-main-0\" (UID: \"7a81f23d-53f8-4572-bd35-b6bda49d0423\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.451093 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.451071 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:06:22.579231 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:22.579205 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:06:22.582817 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:06:22.582789 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a81f23d_53f8_4572_bd35_b6bda49d0423.slice/crio-9722f3305967a999827c791c74e250f98b9ff2c8dd9ed158e1aecdc57d24f8da WatchSource:0}: Error finding container 9722f3305967a999827c791c74e250f98b9ff2c8dd9ed158e1aecdc57d24f8da: Status 404 returned error can't find the container with id 9722f3305967a999827c791c74e250f98b9ff2c8dd9ed158e1aecdc57d24f8da
Apr 16 18:06:23.082136 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:23.082102 2574 generic.go:358] "Generic (PLEG): container finished" podID="7a81f23d-53f8-4572-bd35-b6bda49d0423" containerID="58802af8ecaf847873e3f77a0ebf241569aee21c3e309a64380d5704ee7fa2b2" exitCode=0
Apr 16 18:06:23.082527 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:23.082190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerDied","Data":"58802af8ecaf847873e3f77a0ebf241569aee21c3e309a64380d5704ee7fa2b2"}
Apr 16 18:06:23.082527 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:23.082238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"9722f3305967a999827c791c74e250f98b9ff2c8dd9ed158e1aecdc57d24f8da"}
Apr 16 18:06:24.089228 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"4f8ae09ccf398238493f2edae73f1c3773fe1e6243d6db0c9589dcc2288010c3"}
Apr 16 18:06:24.089228 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"27147811b010024b09cd8a5c3423228b4cea7e6024d63f75f47e68a60959e05e"}
Apr 16 18:06:24.089643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"f90d6eabe30e22c7ffa060129d8086bdd6c200f6d686f8eae64eb496c62a3f51"}
Apr 16 18:06:24.089643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"63cb9c1732f51af2ef65066d230f30f3515f1877d0f6ce6de12298e8e6201b59"}
Apr 16 18:06:24.089643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"84be57ee845e8209f1876b59fc036f45d8ca085f02ccc5e5def7217d84da9a54"}
Apr 16 18:06:24.089643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.089264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7a81f23d-53f8-4572-bd35-b6bda49d0423","Type":"ContainerStarted","Data":"526808d90171445994ed001a41eb456b255be571d5630ada1d77404ff01bf892"}
Apr 16 18:06:24.120006 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.119961 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.119947698 podStartE2EDuration="2.119947698s" podCreationTimestamp="2026-04-16 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:24.11887096 +0000 UTC m=+264.396284852" watchObservedRunningTime="2026-04-16 18:06:24.119947698 +0000 UTC m=+264.397361592"
Apr 16 18:06:24.642057 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.642028 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8684f79487-sfltp"]
Apr 16 18:06:24.645463 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.645445 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.649612 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.649592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:06:24.649815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.649752 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:06:24.649873 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.649844 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5prpn\""
Apr 16 18:06:24.649992 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.649975 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:06:24.650089 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.650074 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:06:24.650138 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.650109 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:06:24.656881 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.656698 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:06:24.657413 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.657391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8684f79487-sfltp"]
Apr 16 18:06:24.788736 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-metrics-client-ca\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788845 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-serving-certs-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-federate-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6znj\" (UniqueName: \"kubernetes.io/projected/01950832-b73a-42d3-a253-92a4760be179-kube-api-access-g6znj\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.788940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.788918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-telemeter-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890219 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890337 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890337 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-serving-certs-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890474 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-federate-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890474 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6znj\" (UniqueName: \"kubernetes.io/projected/01950832-b73a-42d3-a253-92a4760be179-kube-api-access-g6znj\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890575 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890575 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-telemeter-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.890677 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.890578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-metrics-client-ca\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.891063 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.891015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-serving-certs-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.891287 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.891243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.891531 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.891508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01950832-b73a-42d3-a253-92a4760be179-metrics-client-ca\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.892994 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.892945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-telemeter-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.892994 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.892972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.893125 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.893107 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-secret-telemeter-client\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.893160 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.893142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/01950832-b73a-42d3-a253-92a4760be179-federate-client-tls\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.907951 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.907932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6znj\" (UniqueName: \"kubernetes.io/projected/01950832-b73a-42d3-a253-92a4760be179-kube-api-access-g6znj\") pod \"telemeter-client-8684f79487-sfltp\" (UID: \"01950832-b73a-42d3-a253-92a4760be179\") " pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:24.955614 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:24.955593 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp"
Apr 16 18:06:25.080754 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:25.080724 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8684f79487-sfltp"]
Apr 16 18:06:25.083965 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:06:25.083936 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01950832_b73a_42d3_a253_92a4760be179.slice/crio-55b412890bb6bba441dcad9130301027eac48a84b7cb0355af621863b596e43e WatchSource:0}: Error finding container 55b412890bb6bba441dcad9130301027eac48a84b7cb0355af621863b596e43e: Status 404 returned error can't find the container with id 55b412890bb6bba441dcad9130301027eac48a84b7cb0355af621863b596e43e
Apr 16 18:06:25.092826 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:25.092802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp" event={"ID":"01950832-b73a-42d3-a253-92a4760be179","Type":"ContainerStarted","Data":"55b412890bb6bba441dcad9130301027eac48a84b7cb0355af621863b596e43e"}
Apr 16 18:06:27.111703 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:27.111616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp" event={"ID":"01950832-b73a-42d3-a253-92a4760be179","Type":"ContainerStarted","Data":"a77c68575c093b28595350dbfab761516fe04060659515d25698232c4d90e831"}
Apr 16 18:06:27.111703 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:27.111667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp" event={"ID":"01950832-b73a-42d3-a253-92a4760be179","Type":"ContainerStarted","Data":"8f0da3aecd7b0b2d9c72fb286ce24a577ebe2d76afc5fca506175bea7a7098c7"}
Apr 16 18:06:27.111703 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:27.111680 2574 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp" event={"ID":"01950832-b73a-42d3-a253-92a4760be179","Type":"ContainerStarted","Data":"74e4f970d0ee3fb03f7164944000eda424cd3b1a483d6d794518adb440e81e0e"} Apr 16 18:06:27.140625 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:06:27.140579 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8684f79487-sfltp" podStartSLOduration=1.407993394 podStartE2EDuration="3.140564685s" podCreationTimestamp="2026-04-16 18:06:24 +0000 UTC" firstStartedPulling="2026-04-16 18:06:25.085889412 +0000 UTC m=+265.363303284" lastFinishedPulling="2026-04-16 18:06:26.8184607 +0000 UTC m=+267.095874575" observedRunningTime="2026-04-16 18:06:27.135763774 +0000 UTC m=+267.413177666" watchObservedRunningTime="2026-04-16 18:06:27.140564685 +0000 UTC m=+267.417978580" Apr 16 18:07:00.171951 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:07:00.171919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:07:00.172549 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:07:00.172220 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:07:00.188803 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:07:00.188783 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:08:18.853910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.853868 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2"] Apr 16 18:08:18.856477 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.856456 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:18.858716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.858687 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:08:18.858885 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.858719 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dgz9b\"" Apr 16 18:08:18.858885 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.858737 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:08:18.865228 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.865198 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2"] Apr 16 18:08:18.989017 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.988987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:18.989168 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.989034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:18.989168 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:18.989060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57d8\" (UniqueName: \"kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.090301 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.090253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.090445 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.090313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.090445 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.090345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x57d8\" (UniqueName: \"kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.090733 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:08:19.090711 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.090775 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.090758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.100382 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.100340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57d8\" (UniqueName: \"kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.168042 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.167988 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:19.289974 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.289949 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2"] Apr 16 18:08:19.292562 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:08:19.292533 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a552f8a_35ad_4166_b1fc_bddfe78dfc79.slice/crio-f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b WatchSource:0}: Error finding container f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b: Status 404 returned error can't find the container with id f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b Apr 16 18:08:19.294539 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.294521 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:08:19.444692 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:19.444667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" event={"ID":"8a552f8a-35ad-4166-b1fc-bddfe78dfc79","Type":"ContainerStarted","Data":"f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b"} Apr 16 18:08:24.463959 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:24.463917 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerID="436dc9c3226e5d58378905f2cbea8b1ada96a2300cd9143165a547250e4a0b98" exitCode=0 Apr 16 18:08:24.464463 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:24.464011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" 
event={"ID":"8a552f8a-35ad-4166-b1fc-bddfe78dfc79","Type":"ContainerDied","Data":"436dc9c3226e5d58378905f2cbea8b1ada96a2300cd9143165a547250e4a0b98"} Apr 16 18:08:26.471355 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:26.471326 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerID="cdb3875150ea27e5c386f1c85acbca9f15cf4e9dd417540607bb310f6861f586" exitCode=0 Apr 16 18:08:26.471661 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:26.471408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" event={"ID":"8a552f8a-35ad-4166-b1fc-bddfe78dfc79","Type":"ContainerDied","Data":"cdb3875150ea27e5c386f1c85acbca9f15cf4e9dd417540607bb310f6861f586"} Apr 16 18:08:32.495663 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:32.495634 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerID="b788e63bd8e346188b9229d3b0c28a25802b9fb5cffa3a50874797d442d00a3a" exitCode=0 Apr 16 18:08:32.495976 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:32.495676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" event={"ID":"8a552f8a-35ad-4166-b1fc-bddfe78dfc79","Type":"ContainerDied","Data":"b788e63bd8e346188b9229d3b0c28a25802b9fb5cffa3a50874797d442d00a3a"} Apr 16 18:08:33.627428 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.627408 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:33.717101 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.717079 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util\") pod \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " Apr 16 18:08:33.717211 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.717150 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57d8\" (UniqueName: \"kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8\") pod \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " Apr 16 18:08:33.717261 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.717210 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle\") pod \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\" (UID: \"8a552f8a-35ad-4166-b1fc-bddfe78dfc79\") " Apr 16 18:08:33.717764 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.717733 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle" (OuterVolumeSpecName: "bundle") pod "8a552f8a-35ad-4166-b1fc-bddfe78dfc79" (UID: "8a552f8a-35ad-4166-b1fc-bddfe78dfc79"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:33.719289 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.719268 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8" (OuterVolumeSpecName: "kube-api-access-x57d8") pod "8a552f8a-35ad-4166-b1fc-bddfe78dfc79" (UID: "8a552f8a-35ad-4166-b1fc-bddfe78dfc79"). InnerVolumeSpecName "kube-api-access-x57d8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:33.720967 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.720942 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util" (OuterVolumeSpecName: "util") pod "8a552f8a-35ad-4166-b1fc-bddfe78dfc79" (UID: "8a552f8a-35ad-4166-b1fc-bddfe78dfc79"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:33.818517 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.818462 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x57d8\" (UniqueName: \"kubernetes.io/projected/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-kube-api-access-x57d8\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:08:33.818517 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.818485 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:08:33.818517 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:33.818499 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a552f8a-35ad-4166-b1fc-bddfe78dfc79-util\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.503645 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:34.503621 2574 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" Apr 16 18:08:34.503645 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:34.503615 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6mqk2" event={"ID":"8a552f8a-35ad-4166-b1fc-bddfe78dfc79","Type":"ContainerDied","Data":"f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b"} Apr 16 18:08:34.503800 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:34.503657 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c72440c59a45c2cf8011e8b18fdf4ba523236e580efbea52335bbe39ab124b" Apr 16 18:08:40.723931 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.723891 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h"] Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724200 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="pull" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724211 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="pull" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724220 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="extract" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724226 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="extract" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724242 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="util" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724248 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="util" Apr 16 18:08:40.724360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.724297 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a552f8a-35ad-4166-b1fc-bddfe78dfc79" containerName="extract" Apr 16 18:08:40.730351 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.730332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.732318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.732288 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:08:40.732436 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.732353 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:08:40.732436 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.732385 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-2r8fp\"" Apr 16 18:08:40.732436 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.732420 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:08:40.739321 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.739301 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h"] Apr 16 18:08:40.865037 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.865003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnkrm\" (UniqueName: 
\"kubernetes.io/projected/d0bab7e4-2116-4568-9144-427783f0a9c7-kube-api-access-gnkrm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: \"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.865156 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.865074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d0bab7e4-2116-4568-9144-427783f0a9c7-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: \"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.966236 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.966213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d0bab7e4-2116-4568-9144-427783f0a9c7-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: \"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.966341 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.966283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnkrm\" (UniqueName: \"kubernetes.io/projected/d0bab7e4-2116-4568-9144-427783f0a9c7-kube-api-access-gnkrm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: \"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.968553 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.968532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d0bab7e4-2116-4568-9144-427783f0a9c7-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: 
\"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:40.982961 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:40.982903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnkrm\" (UniqueName: \"kubernetes.io/projected/d0bab7e4-2116-4568-9144-427783f0a9c7-kube-api-access-gnkrm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h\" (UID: \"d0bab7e4-2116-4568-9144-427783f0a9c7\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:41.041253 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:41.041231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" Apr 16 18:08:41.164819 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:41.164793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h"] Apr 16 18:08:41.167318 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:08:41.167282 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bab7e4_2116_4568_9144_427783f0a9c7.slice/crio-3ae6cdbb392393715ff265915187e74cf019c4571be7478468387aac8632aab5 WatchSource:0}: Error finding container 3ae6cdbb392393715ff265915187e74cf019c4571be7478468387aac8632aab5: Status 404 returned error can't find the container with id 3ae6cdbb392393715ff265915187e74cf019c4571be7478468387aac8632aab5 Apr 16 18:08:41.526699 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:41.526670 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" event={"ID":"d0bab7e4-2116-4568-9144-427783f0a9c7","Type":"ContainerStarted","Data":"3ae6cdbb392393715ff265915187e74cf019c4571be7478468387aac8632aab5"} Apr 16 18:08:45.401213 ip-10-0-138-15 kubenswrapper[2574]: I0416 
18:08:45.401179 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr6xp"] Apr 16 18:08:45.404784 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.404763 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp" Apr 16 18:08:45.406785 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.406766 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:08:45.406911 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.406892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-dlvgn\"" Apr 16 18:08:45.408251 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.408230 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:08:45.415633 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.415612 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr6xp"] Apr 16 18:08:45.502700 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.502667 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1fa74311-6915-4c6d-970b-58c9c3cee554-cabundle0\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp" Apr 16 18:08:45.502816 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.502710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxj5s\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-kube-api-access-vxj5s\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " 
pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.502867 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.502833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.543887 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.543856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" event={"ID":"d0bab7e4-2116-4568-9144-427783f0a9c7","Type":"ContainerStarted","Data":"1ce52d3be1dd7bc96568f0d4583e2a5f71056ea895f08799ac96f468a8a15a39"}
Apr 16 18:08:45.544024 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.544013 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h"
Apr 16 18:08:45.560760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.560720 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h" podStartSLOduration=1.926983174 podStartE2EDuration="5.560706614s" podCreationTimestamp="2026-04-16 18:08:40 +0000 UTC" firstStartedPulling="2026-04-16 18:08:41.169576295 +0000 UTC m=+401.446990172" lastFinishedPulling="2026-04-16 18:08:44.803299736 +0000 UTC m=+405.080713612" observedRunningTime="2026-04-16 18:08:45.559635654 +0000 UTC m=+405.837049548" watchObservedRunningTime="2026-04-16 18:08:45.560706614 +0000 UTC m=+405.838120508"
Apr 16 18:08:45.603272 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.603248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1fa74311-6915-4c6d-970b-58c9c3cee554-cabundle0\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.603407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.603285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxj5s\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-kube-api-access-vxj5s\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.603480 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.603415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.603537 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:45.603514 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:08:45.603537 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:45.603526 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:08:45.603642 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:45.603543 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr6xp: references non-existent secret key: ca.crt
Apr 16 18:08:45.603642 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:45.603598 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates podName:1fa74311-6915-4c6d-970b-58c9c3cee554 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:46.103583322 +0000 UTC m=+406.380997194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates") pod "keda-operator-ffbb595cb-lr6xp" (UID: "1fa74311-6915-4c6d-970b-58c9c3cee554") : references non-existent secret key: ca.crt
Apr 16 18:08:45.604008 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.603987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1fa74311-6915-4c6d-970b-58c9c3cee554-cabundle0\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:45.613038 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:45.613012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxj5s\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-kube-api-access-vxj5s\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:46.108259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:46.108220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:46.108465 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:46.108410 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:08:46.108465 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:46.108434 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:08:46.108465 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:46.108447 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr6xp: references non-existent secret key: ca.crt
Apr 16 18:08:46.108631 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:46.108511 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates podName:1fa74311-6915-4c6d-970b-58c9c3cee554 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:47.108492535 +0000 UTC m=+407.385906411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates") pod "keda-operator-ffbb595cb-lr6xp" (UID: "1fa74311-6915-4c6d-970b-58c9c3cee554") : references non-existent secret key: ca.crt
Apr 16 18:08:47.116705 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:47.116671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:47.117078 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:47.116802 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:08:47.117078 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:47.116820 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:08:47.117078 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:47.116829 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr6xp: references non-existent secret key: ca.crt
Apr 16 18:08:47.117078 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:47.116877 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates podName:1fa74311-6915-4c6d-970b-58c9c3cee554 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:49.116863794 +0000 UTC m=+409.394277666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates") pod "keda-operator-ffbb595cb-lr6xp" (UID: "1fa74311-6915-4c6d-970b-58c9c3cee554") : references non-existent secret key: ca.crt
Apr 16 18:08:49.133671 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:49.133641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:49.134134 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:49.133813 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:08:49.134134 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:49.133834 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:08:49.134134 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:49.133847 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr6xp: references non-existent secret key: ca.crt
Apr 16 18:08:49.134134 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:08:49.133912 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates podName:1fa74311-6915-4c6d-970b-58c9c3cee554 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:53.13389339 +0000 UTC m=+413.411307265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates") pod "keda-operator-ffbb595cb-lr6xp" (UID: "1fa74311-6915-4c6d-970b-58c9c3cee554") : references non-existent secret key: ca.crt
Apr 16 18:08:53.165483 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:53.165453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:53.167767 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:53.167746 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1fa74311-6915-4c6d-970b-58c9c3cee554-certificates\") pod \"keda-operator-ffbb595cb-lr6xp\" (UID: \"1fa74311-6915-4c6d-970b-58c9c3cee554\") " pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:53.214627 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:53.214602 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:53.333558 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:53.333534 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr6xp"]
Apr 16 18:08:53.336393 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:08:53.336345 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa74311_6915_4c6d_970b_58c9c3cee554.slice/crio-533b267c7422876a5e8153ed900523739fa4878b851a5e4ff834fbaffc2475df WatchSource:0}: Error finding container 533b267c7422876a5e8153ed900523739fa4878b851a5e4ff834fbaffc2475df: Status 404 returned error can't find the container with id 533b267c7422876a5e8153ed900523739fa4878b851a5e4ff834fbaffc2475df
Apr 16 18:08:53.566494 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:53.566461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp" event={"ID":"1fa74311-6915-4c6d-970b-58c9c3cee554","Type":"ContainerStarted","Data":"533b267c7422876a5e8153ed900523739fa4878b851a5e4ff834fbaffc2475df"}
Apr 16 18:08:56.578151 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:56.578119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp" event={"ID":"1fa74311-6915-4c6d-970b-58c9c3cee554","Type":"ContainerStarted","Data":"05d55e768753a5cc3bbdb717b18e90156ff800161c32c2ae86853eafbeda9f10"}
Apr 16 18:08:56.578498 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:56.578271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:08:56.599950 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:08:56.599905 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp" podStartSLOduration=8.565497306 podStartE2EDuration="11.599891878s" podCreationTimestamp="2026-04-16 18:08:45 +0000 UTC" firstStartedPulling="2026-04-16 18:08:53.337610789 +0000 UTC m=+413.615024660" lastFinishedPulling="2026-04-16 18:08:56.372005358 +0000 UTC m=+416.649419232" observedRunningTime="2026-04-16 18:08:56.598576038 +0000 UTC m=+416.875989937" watchObservedRunningTime="2026-04-16 18:08:56.599891878 +0000 UTC m=+416.877305779"
Apr 16 18:09:06.548711 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:06.548681 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dbw4h"
Apr 16 18:09:17.583787 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:17.583709 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lr6xp"
Apr 16 18:09:53.272606 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.272576 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:09:53.275793 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.275775 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:53.278553 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.278517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-p4wh6\""
Apr 16 18:09:53.278774 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.278757 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:09:53.278866 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.278757 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 18:09:53.279058 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.279043 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:09:53.287420 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.287399 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:09:53.399336 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.399307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82c42\" (UniqueName: \"kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:53.399448 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.399433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:53.500439 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.500412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:53.500541 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.500520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82c42\" (UniqueName: \"kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:53.500541 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:09:53.500532 2574 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 18:09:53.500620 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:09:53.500615 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert podName:9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c nodeName:}" failed. No retries permitted until 2026-04-16 18:09:54.000598651 +0000 UTC m=+474.278012523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert") pod "kserve-controller-manager-659c8cbdc-ksbhh" (UID: "9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c") : secret "kserve-webhook-server-cert" not found
Apr 16 18:09:53.508841 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:53.508821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82c42\" (UniqueName: \"kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:54.004485 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:54.004454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:54.006838 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:54.006810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") pod \"kserve-controller-manager-659c8cbdc-ksbhh\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") " pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:54.186858 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:54.186823 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:54.512386 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:54.512328 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:09:54.515098 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:09:54.515073 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3e9620_8f7d_4fb9_bb69_827aff5d9d9c.slice/crio-48a46bf9f318f0f433a9ca2c6487411ddb0753154c55c23415a7eaf43b6c2c47 WatchSource:0}: Error finding container 48a46bf9f318f0f433a9ca2c6487411ddb0753154c55c23415a7eaf43b6c2c47: Status 404 returned error can't find the container with id 48a46bf9f318f0f433a9ca2c6487411ddb0753154c55c23415a7eaf43b6c2c47
Apr 16 18:09:54.781265 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:54.781205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" event={"ID":"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c","Type":"ContainerStarted","Data":"48a46bf9f318f0f433a9ca2c6487411ddb0753154c55c23415a7eaf43b6c2c47"}
Apr 16 18:09:57.795187 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:57.795154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" event={"ID":"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c","Type":"ContainerStarted","Data":"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"}
Apr 16 18:09:57.795545 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:57.795329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:09:57.812510 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:09:57.812471 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" podStartSLOduration=2.278722127 podStartE2EDuration="4.81245716s" podCreationTimestamp="2026-04-16 18:09:53 +0000 UTC" firstStartedPulling="2026-04-16 18:09:54.516682409 +0000 UTC m=+474.794096281" lastFinishedPulling="2026-04-16 18:09:57.050417432 +0000 UTC m=+477.327831314" observedRunningTime="2026-04-16 18:09:57.810848876 +0000 UTC m=+478.088262770" watchObservedRunningTime="2026-04-16 18:09:57.81245716 +0000 UTC m=+478.089871054"
Apr 16 18:10:28.627947 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.627905 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:10:28.628543 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.628171 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" podUID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" containerName="manager" containerID="cri-o://5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4" gracePeriod=10
Apr 16 18:10:28.633716 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.633687 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:10:28.657087 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.657059 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-gxhdn"]
Apr 16 18:10:28.660957 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.660931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.671071 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.671046 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-gxhdn"]
Apr 16 18:10:28.673330 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.673306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-cert\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.673497 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.673476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q72q\" (UniqueName: \"kubernetes.io/projected/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-kube-api-access-5q72q\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.774558 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.774533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q72q\" (UniqueName: \"kubernetes.io/projected/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-kube-api-access-5q72q\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.774676 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.774663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-cert\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.776862 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.776845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-cert\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.782296 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.782275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q72q\" (UniqueName: \"kubernetes.io/projected/c593e40a-0664-4c32-ad74-e3fa5a8d18e0-kube-api-access-5q72q\") pod \"kserve-controller-manager-659c8cbdc-gxhdn\" (UID: \"c593e40a-0664-4c32-ad74-e3fa5a8d18e0\") " pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:28.861454 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.861436 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:10:28.875659 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.875639 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") pod \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") "
Apr 16 18:10:28.875740 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.875669 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82c42\" (UniqueName: \"kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42\") pod \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\" (UID: \"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c\") "
Apr 16 18:10:28.878157 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.878091 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert" (OuterVolumeSpecName: "cert") pod "9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" (UID: "9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:10:28.878157 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.878110 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42" (OuterVolumeSpecName: "kube-api-access-82c42") pod "9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" (UID: "9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c"). InnerVolumeSpecName "kube-api-access-82c42". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:10:28.901470 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.901440 2574 generic.go:358] "Generic (PLEG): container finished" podID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" containerID="5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4" exitCode=0
Apr 16 18:10:28.901572 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.901507 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh"
Apr 16 18:10:28.901572 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.901528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" event={"ID":"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c","Type":"ContainerDied","Data":"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"}
Apr 16 18:10:28.901572 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.901569 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-ksbhh" event={"ID":"9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c","Type":"ContainerDied","Data":"48a46bf9f318f0f433a9ca2c6487411ddb0753154c55c23415a7eaf43b6c2c47"}
Apr 16 18:10:28.901686 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.901590 2574 scope.go:117] "RemoveContainer" containerID="5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"
Apr 16 18:10:28.909611 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.909593 2574 scope.go:117] "RemoveContainer" containerID="5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"
Apr 16 18:10:28.909866 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:10:28.909843 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4\": container with ID starting with 5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4 not found: ID does not exist" containerID="5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"
Apr 16 18:10:28.909917 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.909877 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4"} err="failed to get container status \"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4\": rpc error: code = NotFound desc = could not find container \"5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4\": container with ID starting with 5bd79c1d15ffd6dfe9ff4a2676fb60a0550a672820020911a2810de934b32be4 not found: ID does not exist"
Apr 16 18:10:28.922358 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.922336 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:10:28.926835 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.926815 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-ksbhh"]
Apr 16 18:10:28.976770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.976746 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82c42\" (UniqueName: \"kubernetes.io/projected/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-kube-api-access-82c42\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:10:28.976770 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:28.976766 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c-cert\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 16 18:10:29.003656 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.003636 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:29.131008 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.130986 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-gxhdn"]
Apr 16 18:10:29.132835 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:10:29.132805 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc593e40a_0664_4c32_ad74_e3fa5a8d18e0.slice/crio-ea6d327cbd19a754ef2312ef0fc5eb407a2c9545e255152718a430f1a75ec6a3 WatchSource:0}: Error finding container ea6d327cbd19a754ef2312ef0fc5eb407a2c9545e255152718a430f1a75ec6a3: Status 404 returned error can't find the container with id ea6d327cbd19a754ef2312ef0fc5eb407a2c9545e255152718a430f1a75ec6a3
Apr 16 18:10:29.908271 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.908190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn" event={"ID":"c593e40a-0664-4c32-ad74-e3fa5a8d18e0","Type":"ContainerStarted","Data":"311dbf15a291e6c2affe73296e9d8f3387b41b3284c0d75bdf1ab3a93284312f"}
Apr 16 18:10:29.908271 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.908234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn" event={"ID":"c593e40a-0664-4c32-ad74-e3fa5a8d18e0","Type":"ContainerStarted","Data":"ea6d327cbd19a754ef2312ef0fc5eb407a2c9545e255152718a430f1a75ec6a3"}
Apr 16 18:10:29.908686 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.908337 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:10:29.928251 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:29.928200 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn" podStartSLOduration=1.459248098 podStartE2EDuration="1.928188228s" podCreationTimestamp="2026-04-16 18:10:28 +0000 UTC" firstStartedPulling="2026-04-16 18:10:29.134112639 +0000 UTC m=+509.411526515" lastFinishedPulling="2026-04-16 18:10:29.603052774 +0000 UTC m=+509.880466645" observedRunningTime="2026-04-16 18:10:29.92764428 +0000 UTC m=+510.205058174" watchObservedRunningTime="2026-04-16 18:10:29.928188228 +0000 UTC m=+510.205602155"
Apr 16 18:10:30.296599 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:10:30.296565 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" path="/var/lib/kubelet/pods/9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c/volumes"
Apr 16 18:11:00.915757 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:00.915731 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-gxhdn"
Apr 16 18:11:01.852015 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.851975 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-hcv89"]
Apr 16 18:11:01.852519 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.852503 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" containerName="manager"
Apr 16 18:11:01.852577 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.852523 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" containerName="manager"
Apr 16 18:11:01.852619 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.852608 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f3e9620-8f7d-4fb9-bb69-827aff5d9d9c" containerName="manager"
Apr 16 18:11:01.856883 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.856862 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:01.858768 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.858744 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 18:11:01.859237 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.859215 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-7tdm2\""
Apr 16 18:11:01.865266 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.865239 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hcv89"]
Apr 16 18:11:01.917071 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.917033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc890e7-ea43-406d-8342-f6544bc76017-tls-certs\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:01.917424 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:01.917088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696bm\" (UniqueName: \"kubernetes.io/projected/6cc890e7-ea43-406d-8342-f6544bc76017-kube-api-access-696bm\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.017519 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.017486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc890e7-ea43-406d-8342-f6544bc76017-tls-certs\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.017705 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.017543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-696bm\" (UniqueName: \"kubernetes.io/projected/6cc890e7-ea43-406d-8342-f6544bc76017-kube-api-access-696bm\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.020165 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.020129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc890e7-ea43-406d-8342-f6544bc76017-tls-certs\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.025857 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.025823 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-696bm\" (UniqueName: \"kubernetes.io/projected/6cc890e7-ea43-406d-8342-f6544bc76017-kube-api-access-696bm\") pod \"model-serving-api-86f7b4b499-hcv89\" (UID: \"6cc890e7-ea43-406d-8342-f6544bc76017\") " pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.169434 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.169332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:02.294015 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:11:02.293984 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc890e7_ea43_406d_8342_f6544bc76017.slice/crio-fdbabc2439f3e6e2c34cb6585df065742c5628c916ddff9da6be83c24346faa5 WatchSource:0}: Error finding container fdbabc2439f3e6e2c34cb6585df065742c5628c916ddff9da6be83c24346faa5: Status 404 returned error can't find the container with id fdbabc2439f3e6e2c34cb6585df065742c5628c916ddff9da6be83c24346faa5
Apr 16 18:11:02.295960 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:02.295942 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hcv89"]
Apr 16 18:11:03.023027 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:03.022991 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hcv89" event={"ID":"6cc890e7-ea43-406d-8342-f6544bc76017","Type":"ContainerStarted","Data":"fdbabc2439f3e6e2c34cb6585df065742c5628c916ddff9da6be83c24346faa5"}
Apr 16 18:11:04.027119 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:04.027084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hcv89" event={"ID":"6cc890e7-ea43-406d-8342-f6544bc76017","Type":"ContainerStarted","Data":"a145833f3cbf5104faaf56eaea0c2cc745be03588029a7e2addf4255538268f9"}
Apr 16 18:11:04.027503 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:04.027214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-hcv89"
Apr 16 18:11:04.046779 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:04.046737 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-hcv89" podStartSLOduration=1.457064761 podStartE2EDuration="3.04672378s" podCreationTimestamp="2026-04-16
18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:02.296019659 +0000 UTC m=+542.573433532" lastFinishedPulling="2026-04-16 18:11:03.88567868 +0000 UTC m=+544.163092551" observedRunningTime="2026-04-16 18:11:04.045350509 +0000 UTC m=+544.322764407" watchObservedRunningTime="2026-04-16 18:11:04.04672378 +0000 UTC m=+544.324137674" Apr 16 18:11:15.034440 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:11:15.034413 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-hcv89" Apr 16 18:12:00.204484 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:12:00.204455 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:12:00.205534 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:12:00.205508 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:14:44.280839 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.280802 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:14:44.284445 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.284422 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.287540 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.287511 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\"" Apr 16 18:14:44.287685 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.287515 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-bfe16-serving-cert\"" Apr 16 18:14:44.287685 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.287523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:14:44.287815 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.287782 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-bfe16-kube-rbac-proxy-sar-config\"" Apr 16 18:14:44.306212 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.306190 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:14:44.387227 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.387202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.387329 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.387258 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" 
(UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.488307 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.488286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.488469 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.488430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.488963 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.488936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.491404 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.491355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls\") pod \"model-chainer-raw-bfe16-554c8fd6df-dwhzf\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.597760 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.597697 2574 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:44.928729 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.928703 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:14:44.932504 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:14:44.932476 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e3ba9c_179e_4a92_a842_bef9588d4b13.slice/crio-666b6f640ceffdf5a2fdf75b90225ab5a338afcf2c749f273c709276be2f1275 WatchSource:0}: Error finding container 666b6f640ceffdf5a2fdf75b90225ab5a338afcf2c749f273c709276be2f1275: Status 404 returned error can't find the container with id 666b6f640ceffdf5a2fdf75b90225ab5a338afcf2c749f273c709276be2f1275 Apr 16 18:14:44.934191 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:44.934176 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:14:45.791263 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:45.791223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" event={"ID":"03e3ba9c-179e-4a92-a842-bef9588d4b13","Type":"ContainerStarted","Data":"666b6f640ceffdf5a2fdf75b90225ab5a338afcf2c749f273c709276be2f1275"} Apr 16 18:14:47.801320 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:47.801247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" event={"ID":"03e3ba9c-179e-4a92-a842-bef9588d4b13","Type":"ContainerStarted","Data":"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e"} Apr 16 18:14:47.801681 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:47.801364 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 
18:14:47.816260 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:47.816218 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podStartSLOduration=1.2284832350000001 podStartE2EDuration="3.816205005s" podCreationTimestamp="2026-04-16 18:14:44 +0000 UTC" firstStartedPulling="2026-04-16 18:14:44.934292507 +0000 UTC m=+765.211706379" lastFinishedPulling="2026-04-16 18:14:47.522014274 +0000 UTC m=+767.799428149" observedRunningTime="2026-04-16 18:14:47.815669571 +0000 UTC m=+768.093083469" watchObservedRunningTime="2026-04-16 18:14:47.816205005 +0000 UTC m=+768.093618899" Apr 16 18:14:53.811289 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:53.811260 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:14:54.296930 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:54.296902 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:14:54.297148 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:54.297125 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" containerID="cri-o://6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e" gracePeriod=30 Apr 16 18:14:58.809899 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:14:58.809842 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:03.810383 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:03.810334 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:08.809326 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:08.809289 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:08.809744 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:08.809406 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:15:13.809200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:13.809162 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:18.809463 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:18.809382 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:23.809867 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:23.809832 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:24.322213 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:15:24.322181 2574 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e3ba9c_179e_4a92_a842_bef9588d4b13.slice/crio-conmon-6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:15:24.322341 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:15:24.322263 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e3ba9c_179e_4a92_a842_bef9588d4b13.slice/crio-6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:15:24.448920 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.448896 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:15:24.584913 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.584846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls\") pod \"03e3ba9c-179e-4a92-a842-bef9588d4b13\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " Apr 16 18:15:24.584913 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.584893 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle\") pod \"03e3ba9c-179e-4a92-a842-bef9588d4b13\" (UID: \"03e3ba9c-179e-4a92-a842-bef9588d4b13\") " Apr 16 18:15:24.585294 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.585266 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "03e3ba9c-179e-4a92-a842-bef9588d4b13" (UID: "03e3ba9c-179e-4a92-a842-bef9588d4b13"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:15:24.586964 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.586940 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "03e3ba9c-179e-4a92-a842-bef9588d4b13" (UID: "03e3ba9c-179e-4a92-a842-bef9588d4b13"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:15:24.685855 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.685830 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03e3ba9c-179e-4a92-a842-bef9588d4b13-proxy-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:15:24.685855 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.685855 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03e3ba9c-179e-4a92-a842-bef9588d4b13-openshift-service-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:15:24.930967 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.930933 2574 generic.go:358] "Generic (PLEG): container finished" podID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerID="6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e" exitCode=0 Apr 16 18:15:24.931384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.931007 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" Apr 16 18:15:24.931384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.931008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" event={"ID":"03e3ba9c-179e-4a92-a842-bef9588d4b13","Type":"ContainerDied","Data":"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e"} Apr 16 18:15:24.931384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.931113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf" event={"ID":"03e3ba9c-179e-4a92-a842-bef9588d4b13","Type":"ContainerDied","Data":"666b6f640ceffdf5a2fdf75b90225ab5a338afcf2c749f273c709276be2f1275"} Apr 16 18:15:24.931384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.931141 2574 scope.go:117] "RemoveContainer" containerID="6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e" Apr 16 18:15:24.939923 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.939906 2574 scope.go:117] "RemoveContainer" containerID="6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e" Apr 16 18:15:24.940184 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:15:24.940167 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e\": container with ID starting with 6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e not found: ID does not exist" containerID="6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e" Apr 16 18:15:24.940249 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.940191 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e"} err="failed to get container status 
\"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e\": rpc error: code = NotFound desc = could not find container \"6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e\": container with ID starting with 6e7ea2093910634d46bf40408545e8afba751c91c90783bdafbcf63c6098432e not found: ID does not exist" Apr 16 18:15:24.953038 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.953009 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:15:24.956724 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:24.956704 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-bfe16-554c8fd6df-dwhzf"] Apr 16 18:15:26.295692 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:15:26.295658 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" path="/var/lib/kubelet/pods/03e3ba9c-179e-4a92-a842-bef9588d4b13/volumes" Apr 16 18:16:14.525101 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.525066 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:14.525600 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.525555 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" Apr 16 18:16:14.525600 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.525573 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" Apr 16 18:16:14.525717 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.525670 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="03e3ba9c-179e-4a92-a842-bef9588d4b13" containerName="model-chainer-raw-bfe16" Apr 16 18:16:14.528993 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.528974 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:14.531010 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.530990 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-7f673-kube-rbac-proxy-sar-config\"" Apr 16 18:16:14.531161 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.531140 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:16:14.531444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.531427 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-7f673-serving-cert\"" Apr 16 18:16:14.531491 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.531475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\"" Apr 16 18:16:14.536033 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.536013 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:14.555675 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.555651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:14.555794 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.555710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:14.656050 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.656029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:14.656137 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.656077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:14.656189 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:16:14.656168 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-serving-cert: secret "model-chainer-raw-hpa-7f673-serving-cert" not found Apr 16 18:16:14.656252 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:16:14.656241 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls podName:5b3b46f2-a784-47a9-b7dc-d560c2a12973 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:15.156222375 +0000 UTC m=+855.433636252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls") pod "model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" (UID: "5b3b46f2-a784-47a9-b7dc-d560c2a12973") : secret "model-chainer-raw-hpa-7f673-serving-cert" not found Apr 16 18:16:14.656746 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:14.656729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:15.159506 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:15.159468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:15.161774 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:15.161744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") pod \"model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:15.440472 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:15.440443 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:15.558563 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:15.558539 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:15.561358 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:16:15.561321 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3b46f2_a784_47a9_b7dc_d560c2a12973.slice/crio-559dd71eca3da25021b387c150bc1398da78e599c10bede47d70f4b0574e2686 WatchSource:0}: Error finding container 559dd71eca3da25021b387c150bc1398da78e599c10bede47d70f4b0574e2686: Status 404 returned error can't find the container with id 559dd71eca3da25021b387c150bc1398da78e599c10bede47d70f4b0574e2686 Apr 16 18:16:16.113030 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:16.112987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" event={"ID":"5b3b46f2-a784-47a9-b7dc-d560c2a12973","Type":"ContainerStarted","Data":"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f"} Apr 16 18:16:16.113248 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:16.113040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" event={"ID":"5b3b46f2-a784-47a9-b7dc-d560c2a12973","Type":"ContainerStarted","Data":"559dd71eca3da25021b387c150bc1398da78e599c10bede47d70f4b0574e2686"} Apr 16 18:16:16.113248 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:16.113082 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:16.128674 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:16.128624 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podStartSLOduration=2.128605914 podStartE2EDuration="2.128605914s" podCreationTimestamp="2026-04-16 18:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:16.12719767 +0000 UTC m=+856.404611565" watchObservedRunningTime="2026-04-16 18:16:16.128605914 +0000 UTC m=+856.406019810" Apr 16 18:16:22.123216 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:22.123186 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:24.580671 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:24.580629 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:24.581080 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:24.580857 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" containerID="cri-o://a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f" gracePeriod=30 Apr 16 18:16:27.122431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:27.122393 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:16:32.121086 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:32.121040 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 16 18:16:37.121152 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:37.121107 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:16:37.121551 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:37.121216 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:42.121678 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:42.121631 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:16:47.121504 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:47.121420 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:16:52.121432 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:52.121385 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:16:54.723324 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.723304 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:54.854348 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.854284 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") pod \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " Apr 16 18:16:54.854506 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.854420 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle\") pod \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\" (UID: \"5b3b46f2-a784-47a9-b7dc-d560c2a12973\") " Apr 16 18:16:54.854713 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.854693 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5b3b46f2-a784-47a9-b7dc-d560c2a12973" (UID: "5b3b46f2-a784-47a9-b7dc-d560c2a12973"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:16:54.856293 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.856270 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b3b46f2-a784-47a9-b7dc-d560c2a12973" (UID: "5b3b46f2-a784-47a9-b7dc-d560c2a12973"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:16:54.955638 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.955606 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b3b46f2-a784-47a9-b7dc-d560c2a12973-proxy-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:16:54.955638 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:54.955637 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3b46f2-a784-47a9-b7dc-d560c2a12973-openshift-service-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 16 18:16:55.256102 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.256063 2574 generic.go:358] "Generic (PLEG): container finished" podID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerID="a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f" exitCode=0 Apr 16 18:16:55.256259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.256150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" event={"ID":"5b3b46f2-a784-47a9-b7dc-d560c2a12973","Type":"ContainerDied","Data":"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f"} Apr 16 18:16:55.256259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.256184 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" Apr 16 18:16:55.256259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.256199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b" event={"ID":"5b3b46f2-a784-47a9-b7dc-d560c2a12973","Type":"ContainerDied","Data":"559dd71eca3da25021b387c150bc1398da78e599c10bede47d70f4b0574e2686"} Apr 16 18:16:55.256259 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.256220 2574 scope.go:117] "RemoveContainer" containerID="a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f" Apr 16 18:16:55.264951 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.264936 2574 scope.go:117] "RemoveContainer" containerID="a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f" Apr 16 18:16:55.265210 ip-10-0-138-15 kubenswrapper[2574]: E0416 18:16:55.265190 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f\": container with ID starting with a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f not found: ID does not exist" containerID="a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f" Apr 16 18:16:55.265272 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.265216 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f"} err="failed to get container status \"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f\": rpc error: code = NotFound desc = could not find container \"a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f\": container with ID starting with a7d8b35592d547dc00aa8cae6e6c7845dc2a5aadda8705fb14691f0e5041166f not found: ID does not exist" Apr 16 18:16:55.276460 ip-10-0-138-15 kubenswrapper[2574]: 
I0416 18:16:55.276439 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:55.280940 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:55.280921 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-7f673-84b7cdb857-dfs5b"] Apr 16 18:16:56.295896 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:16:56.295858 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" path="/var/lib/kubelet/pods/5b3b46f2-a784-47a9-b7dc-d560c2a12973/volumes" Apr 16 18:17:00.228407 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:17:00.228383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:17:00.230114 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:17:00.230094 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:22:00.254590 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:22:00.254557 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:22:00.257110 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:22:00.257087 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:24:28.173821 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:28.173796 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gtd4f_5b5f0a92-12a8-4b64-989f-c8db774a38c1/global-pull-secret-syncer/0.log" Apr 16 18:24:28.219258 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:24:28.219228 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5h65c_276ccc08-ca8c-4aca-9ce7-0dcbf63c4c4c/konnectivity-agent/0.log" Apr 16 18:24:28.353833 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:28.353813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-15.ec2.internal_ede70444dfe8774ee4b88dac28542853/haproxy/0.log" Apr 16 18:24:32.019810 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.019778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/alertmanager/0.log" Apr 16 18:24:32.049655 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.049628 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/config-reloader/0.log" Apr 16 18:24:32.081196 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.081172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/kube-rbac-proxy-web/0.log" Apr 16 18:24:32.110523 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.110503 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/kube-rbac-proxy/0.log" Apr 16 18:24:32.138415 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.138392 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/kube-rbac-proxy-metric/0.log" Apr 16 18:24:32.163696 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.163674 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/prom-label-proxy/0.log" Apr 16 18:24:32.188249 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:24:32.188231 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7a81f23d-53f8-4572-bd35-b6bda49d0423/init-config-reloader/0.log" Apr 16 18:24:32.226827 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.226803 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-k7vp9_8b0ce7ac-c20c-4958-8a1f-f646f39de6ab/cluster-monitoring-operator/0.log" Apr 16 18:24:32.339458 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.339407 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-fc9cdddd7-svpwj_44d651ae-7188-4e4e-af07-3492f11dc279/metrics-server/0.log" Apr 16 18:24:32.488798 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.488777 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kxqds_f65b5dc6-1f16-4b6a-8748-9fe5615c9e68/node-exporter/0.log" Apr 16 18:24:32.509577 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.509556 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kxqds_f65b5dc6-1f16-4b6a-8748-9fe5615c9e68/kube-rbac-proxy/0.log" Apr 16 18:24:32.535773 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.535759 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kxqds_f65b5dc6-1f16-4b6a-8748-9fe5615c9e68/init-textfile/0.log" Apr 16 18:24:32.649621 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.649567 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bh2pn_791a4644-5779-4690-b410-ae72df65f3bd/kube-rbac-proxy-main/0.log" Apr 16 18:24:32.677755 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.677731 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bh2pn_791a4644-5779-4690-b410-ae72df65f3bd/kube-rbac-proxy-self/0.log" Apr 16 18:24:32.701962 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:32.701943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bh2pn_791a4644-5779-4690-b410-ae72df65f3bd/openshift-state-metrics/0.log" Apr 16 18:24:33.033809 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.033783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8684f79487-sfltp_01950832-b73a-42d3-a253-92a4760be179/telemeter-client/0.log" Apr 16 18:24:33.075461 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.075438 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8684f79487-sfltp_01950832-b73a-42d3-a253-92a4760be179/reload/0.log" Apr 16 18:24:33.111172 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.111150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8684f79487-sfltp_01950832-b73a-42d3-a253-92a4760be179/kube-rbac-proxy/0.log" Apr 16 18:24:33.167783 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.167767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/thanos-query/0.log" Apr 16 18:24:33.198492 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.198475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/kube-rbac-proxy-web/0.log" Apr 16 18:24:33.226615 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.226599 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/kube-rbac-proxy/0.log" Apr 16 18:24:33.253711 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.253696 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/prom-label-proxy/0.log" Apr 16 18:24:33.283063 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.283047 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/kube-rbac-proxy-rules/0.log" Apr 16 18:24:33.311969 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:33.311917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566b7d494-ppmb4_3871267c-4368-4f90-9ba9-a4d4b8345f94/kube-rbac-proxy-metrics/0.log" Apr 16 18:24:34.321542 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.321518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-kgn8f_5e4633a3-2eb6-4ec0-af69-c94c4487082b/networking-console-plugin/0.log" Apr 16 18:24:34.710182 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.710154 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/1.log" Apr 16 18:24:34.715790 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.715769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-gssjt_dc84b07b-78de-4ae8-bb54-38e289deb5f1/console-operator/2.log" Apr 16 18:24:34.931945 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.931919 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68"] Apr 16 18:24:34.932273 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.932260 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" 
containerName="model-chainer-raw-hpa-7f673" Apr 16 18:24:34.932318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.932275 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" Apr 16 18:24:34.932352 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.932344 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b3b46f2-a784-47a9-b7dc-d560c2a12973" containerName="model-chainer-raw-hpa-7f673" Apr 16 18:24:34.935342 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.935326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:34.942028 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.942000 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"kube-root-ca.crt\"" Apr 16 18:24:34.942557 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.942532 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4f7nz\"/\"default-dockercfg-h5b95\"" Apr 16 18:24:34.942959 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.942941 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"openshift-service-ca.crt\"" Apr 16 18:24:34.945649 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:34.945626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68"] Apr 16 18:24:35.044200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.044138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-lib-modules\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " 
pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.044200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.044179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-sys\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.044200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.044203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-proc\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.044360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.044284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-podres\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.044360 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.044329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxhmm\" (UniqueName: \"kubernetes.io/projected/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-kube-api-access-hxhmm\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.144789 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.144762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-podres\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.144889 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.144799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxhmm\" (UniqueName: \"kubernetes.io/projected/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-kube-api-access-hxhmm\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.144889 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.144842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-lib-modules\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.144889 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.144882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-sys\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.145060 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.144916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-podres\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.145060 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:24:35.144963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-sys\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.145060 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.145005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-proc\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.145060 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.145009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-lib-modules\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.145200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.145058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-proc\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.151936 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.151914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxhmm\" (UniqueName: \"kubernetes.io/projected/7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0-kube-api-access-hxhmm\") pod \"perf-node-gather-daemonset-ltk68\" (UID: \"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0\") " 
pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.254108 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.254078 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.370418 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.370393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68"] Apr 16 18:24:35.372671 ip-10-0-138-15 kubenswrapper[2574]: W0416 18:24:35.372641 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7bdc9c32_d456_41cf_a97d_ba39b7a9cdb0.slice/crio-157a6f7b9877110e65695299276f1ef64fa085a7774ef61375cae6bfdc1c04ed WatchSource:0}: Error finding container 157a6f7b9877110e65695299276f1ef64fa085a7774ef61375cae6bfdc1c04ed: Status 404 returned error can't find the container with id 157a6f7b9877110e65695299276f1ef64fa085a7774ef61375cae6bfdc1c04ed Apr 16 18:24:35.374355 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.374339 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:24:35.842881 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.842848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" event={"ID":"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0","Type":"ContainerStarted","Data":"f31885938c931dd87f5e1986a1b08b7c6ae52e3ece3babf42c45909bbd650f4e"} Apr 16 18:24:35.843007 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.842891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" event={"ID":"7bdc9c32-d456-41cf-a97d-ba39b7a9cdb0","Type":"ContainerStarted","Data":"157a6f7b9877110e65695299276f1ef64fa085a7774ef61375cae6bfdc1c04ed"} Apr 16 18:24:35.843007 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.842931 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:35.857723 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:35.857683 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" podStartSLOduration=1.8576704830000002 podStartE2EDuration="1.857670483s" podCreationTimestamp="2026-04-16 18:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:24:35.856272455 +0000 UTC m=+1356.133686363" watchObservedRunningTime="2026-04-16 18:24:35.857670483 +0000 UTC m=+1356.135084374" Apr 16 18:24:36.194667 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:36.194641 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-btqpr_567b1659-cf99-4ad6-aad7-99460710d869/dns/0.log" Apr 16 18:24:36.217444 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:36.217425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-btqpr_567b1659-cf99-4ad6-aad7-99460710d869/kube-rbac-proxy/0.log" Apr 16 18:24:36.344929 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:36.344906 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2nlc5_2f391f14-a93d-421b-8f8c-642ea53a1269/dns-node-resolver/0.log" Apr 16 18:24:36.799964 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:36.799940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5ddc6d5d76-tvmvt_97b7c142-016e-4ff4-a801-dab62c30ca70/registry/0.log" Apr 16 18:24:36.820901 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:36.820878 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qwbnl_9d5a1448-8fde-4017-8bb4-c60dbbbe2a1b/node-ca/0.log" Apr 16 18:24:38.011944 ip-10-0-138-15 
kubenswrapper[2574]: I0416 18:24:38.011917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z9628_6e8ad75a-91e2-49d1-b5b3-8f18c05f867d/serve-healthcheck-canary/0.log" Apr 16 18:24:38.475643 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:38.475616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-shshw_84abdb7e-6fd5-451b-ace1-130db696d178/kube-rbac-proxy/0.log" Apr 16 18:24:38.498935 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:38.498914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-shshw_84abdb7e-6fd5-451b-ace1-130db696d178/exporter/0.log" Apr 16 18:24:38.521910 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:38.521889 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-shshw_84abdb7e-6fd5-451b-ace1-130db696d178/extractor/0.log" Apr 16 18:24:40.523699 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:40.523666 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-gxhdn_c593e40a-0664-4c32-ad74-e3fa5a8d18e0/manager/0.log" Apr 16 18:24:40.569318 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:40.569293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-hcv89_6cc890e7-ea43-406d-8342-f6544bc76017/server/0.log" Apr 16 18:24:41.857295 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:41.857263 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-ltk68" Apr 16 18:24:46.314100 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.314019 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/kube-multus-additional-cni-plugins/0.log" Apr 16 18:24:46.338703 
ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.338685 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/egress-router-binary-copy/0.log" Apr 16 18:24:46.365566 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.365543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/cni-plugins/0.log" Apr 16 18:24:46.388301 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.388278 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/bond-cni-plugin/0.log" Apr 16 18:24:46.412704 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.412685 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/routeoverride-cni/0.log" Apr 16 18:24:46.434618 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.434588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/whereabouts-cni-bincopy/0.log" Apr 16 18:24:46.456292 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.456270 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-frw6p_1e22989e-67d8-41f9-acfa-874e304428b5/whereabouts-cni/0.log" Apr 16 18:24:46.886675 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.886651 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w2x27_c03c1135-b04a-4662-866b-1180974b6c3e/kube-multus/0.log" Apr 16 18:24:46.912719 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.912689 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-24pqv_9d69a3dd-c0fd-4764-b3d8-802189b16640/network-metrics-daemon/0.log" Apr 16 18:24:46.931919 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:46.931865 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-24pqv_9d69a3dd-c0fd-4764-b3d8-802189b16640/kube-rbac-proxy/0.log" Apr 16 18:24:48.426286 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.426253 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/ovn-controller/0.log" Apr 16 18:24:48.453487 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.453461 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/ovn-acl-logging/0.log" Apr 16 18:24:48.475200 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.475178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/kube-rbac-proxy-node/0.log" Apr 16 18:24:48.507853 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.507830 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:24:48.531555 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.531522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/northd/0.log" Apr 16 18:24:48.554340 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.554321 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/nbdb/0.log" Apr 16 18:24:48.577874 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.577856 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/sbdb/0.log" Apr 16 18:24:48.683858 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:48.683802 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mv8js_298a93d5-3bc0-4a9d-9dd0-922e60e48669/ovnkube-controller/0.log" Apr 16 18:24:49.988384 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:49.988342 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-bmn9v_d7f5c41a-d5cd-4717-b0c7-464ae63f0d81/check-endpoints/0.log" Apr 16 18:24:50.068219 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:50.068193 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tkqz7_299ac103-77d7-4ae3-b981-8c66d39e67eb/network-check-target-container/0.log" Apr 16 18:24:51.023431 ip-10-0-138-15 kubenswrapper[2574]: I0416 18:24:51.023403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-v9kdc_64cbfc80-dc30-49cf-a0eb-68130da967eb/iptables-alerter/0.log"