Apr 17 16:31:28.338324 ip-10-0-138-170 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:28.884323 ip-10-0-138-170 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:28.884323 ip-10-0-138-170 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:28.884323 ip-10-0-138-170 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:28.884323 ip-10-0-138-170 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:28.884323 ip-10-0-138-170 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
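The deprecation notices above all point at the same remedy: move those settings off the command line and into the file named by --config. A minimal sketch of what such a file might contain (field names follow the KubeletConfiguration `kubelet.config.k8s.io/v1beta1` schema; the values here are illustrative, not taken from this node):

```yaml
# Hypothetical kubelet config-file fragment covering the deprecated flags
# warned about above; adjust paths and reservations to the actual node.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration per the warning
  memory.available: 100Mi
```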
Apr 17 16:31:28.885227 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.885146 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:28.888173 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888159 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:28.888173 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888174 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888178 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888182 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888184 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888187 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888190 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888193 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888196 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888198 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888201 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888209 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888211 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888214 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888216 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888220 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888222 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888226 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888230 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888233 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:28.888241 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888236 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888239 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888241 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888244 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888247 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888250 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888252 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888256 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888260 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888263 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888266 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888268 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888271 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888273 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888276 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888278 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888281 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888283 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888286 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888288 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:28.888694 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888291 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888293 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888296 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888298 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888301 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888303 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:28.889224 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.888306 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:28.890655 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890646 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:28.890655 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890655 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890659 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890662 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890665 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890668 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890671 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890674 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890677 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890680 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890682 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890685 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890690 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890692 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890695 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890698 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890700 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890702 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890705 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890710 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890712 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:28.890715 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890715 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890718 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890720 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890723 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890726 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890729 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890731 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890734 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890736 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890738 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890742 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890745 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890747 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890750 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890752 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890755 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890757 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.890760 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891172 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891179 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:28.891229 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891182 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891185 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891188 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891191 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891193 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891196 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891199 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891201 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891204 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891206 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891209 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891212 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891215 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891217 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891220 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891223 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891226 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891228 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891231 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891234 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:28.891746 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891236 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891239 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891241 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891244 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891246 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891249 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891251 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891254 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891257 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891259 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891261 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891264 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891266 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891268 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891271 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891273 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891276 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891279 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891282 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891284 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:28.892280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891287 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891289 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891291 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891294 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891297 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891300 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891305 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891308 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891311 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891313 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891316 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891318 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891320 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891323 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891325 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891328 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891330 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891333 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891336 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891338 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:28.892805 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891341 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891343 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891348 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891351 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891354 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891357 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891359 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891377 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891381 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891384 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891387 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891389 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891392 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891394 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891397 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891399 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891402 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891405 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891409 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:28.893344 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891412 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891414 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891417 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891420 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.891422 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892573 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892582 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892589 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892593 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892598 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892601 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892605 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892610 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892613 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892616 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892620 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892623 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892627 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892629 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892632 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892635 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892638 2578 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892640 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:28.893807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892643 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892647 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892650 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892653 2578 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892656 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892659 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892663 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892666 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892670 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892673 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892676 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892679 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892682 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892685 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892688 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892692 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892695 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892698 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892701 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892704 2578 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892707 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892711 2578 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892714 2578 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892717 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892720 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:28.894394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892723 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892727 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892730 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892732 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892736 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892738 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892741 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892744 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892746 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892749 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892752 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892756 2578 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892760 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892763 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892766 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892770 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892773 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892776 2578 flags.go:64] FLAG: --help="false"
Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892779 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-138-170.ec2.internal"
Apr 17
16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892782 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892786 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892788 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892792 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:28.894991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892795 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892797 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892800 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892803 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892805 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892808 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892812 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892814 2578 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892817 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892821 2578 
flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892824 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892827 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892829 2578 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892832 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892835 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892838 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892842 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892845 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892848 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892851 2578 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892854 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892857 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892860 2578 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892863 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:28.895576 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892868 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:28.895576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892871 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892880 2578 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892883 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892886 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892889 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892892 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892895 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892897 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892900 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892907 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892914 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892917 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892920 2578 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:31:28.892923 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892929 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892931 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892935 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892938 2578 flags.go:64] FLAG: --port="10250" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892941 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892944 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04676a81ed39e442c" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892947 2578 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892950 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892953 2578 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892955 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:28.896193 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892958 2578 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892962 2578 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892965 2578 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892968 2578 flags.go:64] 
FLAG: --reserved-cpus="" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892971 2578 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892975 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892978 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892981 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892984 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892987 2578 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892990 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892993 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892996 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.892999 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893002 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893005 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893008 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893011 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:28.896775 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:31:28.893013 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893018 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893021 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893024 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893027 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893030 2578 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893032 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:28.896775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893038 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893040 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893043 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893047 2578 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893050 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893052 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893055 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893058 2578 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893061 2578 flags.go:64] FLAG: --v="2" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893082 2578 flags.go:64] FLAG: --version="false" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893088 2578 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893095 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.893100 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893207 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893212 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893215 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893218 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893221 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893224 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893227 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893230 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 
16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893235 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893238 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:28.897385 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893240 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893243 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893246 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893249 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893252 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893255 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893258 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893261 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893264 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893266 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893269 2578 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893271 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893274 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893276 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893279 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893281 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893284 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893286 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893288 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893291 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:28.897933 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893293 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893296 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893298 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893301 2578 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893304 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893308 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893311 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893314 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893316 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893318 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893322 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893324 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893327 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893330 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893332 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893336 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:28.898490 ip-10-0-138-170 
kubenswrapper[2578]: W0417 16:31:28.893338 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893341 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893343 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893346 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:28.898490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893349 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893351 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893354 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893357 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893359 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893361 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893364 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893367 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893370 2578 feature_gate.go:351] Setting GA feature 
gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893374 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893377 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893380 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893382 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893385 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893387 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893390 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893393 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893395 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893397 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:28.898960 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893400 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893405 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:28.899483 
ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893408 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893411 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893413 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893416 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893419 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893421 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893425 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893427 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893430 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893432 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893435 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893438 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893440 
2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893443 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:28.899483 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.893445 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:28.899887 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.894105 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:28.900771 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.900755 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:28.900806 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.900773 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:28.900836 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900822 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:28.900836 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900828 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:28.900836 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900831 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:28.900836 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900834 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:28.900836 ip-10-0-138-170 kubenswrapper[2578]: W0417 
16:31:28.900837 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900840 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900843 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900846 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900849 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900852 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900855 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900857 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900860 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900864 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900868 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900873 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900878 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900883 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900886 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900888 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900891 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900893 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900896 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900898 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:28.900962 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900901 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900903 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900905 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900908 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900910 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900913 2578 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900915 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900918 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900920 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900922 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900925 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900927 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900929 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900932 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900935 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900938 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900941 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900944 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900948 2578 feature_gate.go:328] 
unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900953 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:28.901471 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900957 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900960 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900963 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900966 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900969 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900971 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900974 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900976 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900979 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900981 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900984 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:28.901946 ip-10-0-138-170 
kubenswrapper[2578]: W0417 16:31:28.900986 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900988 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900991 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900993 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.900997 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901001 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901003 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901006 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:28.901946 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901008 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901011 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901013 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901016 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 
16:31:28.901018 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901022 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901026 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901031 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901034 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901036 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901039 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901041 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901044 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901046 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901049 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901051 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901058 
2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901060 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901079 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:28.902440 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901083 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901088 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901092 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901095 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.901100 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901201 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901206 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: 
W0417 16:31:28.901209 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901211 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901214 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901216 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901219 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901221 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901224 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901226 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:28.902926 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901229 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901231 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901234 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901236 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901240 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall 
Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901245 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901249 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901253 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901257 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901260 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901263 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901265 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901268 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901270 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901272 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901276 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901278 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901281 2578 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901283 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901285 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:28.903307 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901288 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901290 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901293 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901295 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901298 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901300 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901302 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901305 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901307 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901309 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 
16:31:28.901312 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901315 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901317 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901320 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901324 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901328 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901332 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901337 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901342 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:28.903791 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901346 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901349 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901351 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901354 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901356 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901359 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901361 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901363 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901366 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901369 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901371 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901374 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 
16:31:28.901376 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901379 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901383 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901385 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901388 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901390 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901393 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:28.904261 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901395 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901397 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901400 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901403 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901407 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901411 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 
16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901415 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901419 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901423 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901426 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901429 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901431 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901434 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901436 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901439 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901442 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901444 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:28.904768 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:28.901446 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:28.905203 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.901451 2578 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:28.905203 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.902228 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:31:28.905203 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.904299 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:31:28.905291 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.905276 2578 server.go:1019] "Starting client certificate rotation" Apr 17 16:31:28.905402 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.905382 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:28.905453 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.905432 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:28.933748 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.933727 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:28.939972 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.939941 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:28.952489 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.952472 2578 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:31:28.959080 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:31:28.959052 2578 log.go:25] "Validated CRI v1 image API" Apr 17 16:31:28.960280 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.960258 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 16:31:28.967594 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.967576 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:28.969130 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.969107 2578 fs.go:135] Filesystem UUIDs: map[16e9dfa1-137b-4cc5-9b45-5ad59fa1f490:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 cd84e423-1c36-447d-91e7-045b2094a359:/dev/nvme0n1p4] Apr 17 16:31:28.969185 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.969131 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:31:28.976081 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.975955 2578 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:28.972799225 +0000 UTC m=+0.494376166 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104220 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec254be90335b21e18ba9d7fdb25e276 SystemUUID:ec254be9-0335-b21e-18ba-9d7fdb25e276 BootID:9dfbed63-e089-4414-9568-1b8e0d5f54b0 Filesystems:[{Device:/tmp DeviceMajor:0 
DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:41:68:ea:f0:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:41:68:ea:f0:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:57:d9:38:cf:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:31:28.976081 ip-10-0-138-170 kubenswrapper[2578]: 
I0417 16:31:28.976075 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:28.976183 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976151 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:28.976499 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976475 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:28.976631 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976501 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-170.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:31:28.976672 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976640 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:31:28.976672 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976649 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:31:28.976672 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.976663 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:28.977630 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.977620 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:28.979881 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.979870 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:31:28.979980 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.979971 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:31:28.983168 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.983156 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:31:28.983199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.983178 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:31:28.983199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.983195 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:31:28.983248 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.983206 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:31:28.983248 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.983214 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:31:28.984905 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.984893 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:28.984941 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.984912 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:28.985305 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.985269 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9cr5"
Apr 17 16:31:28.987895 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.987878 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:31:28.989713 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.989697 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:31:28.991099 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991085 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991107 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991117 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991124 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991133 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991142 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991150 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991158 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 16:31:28.991172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991168 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 16:31:28.991428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991176 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 16:31:28.991428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991204 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 16:31:28.991428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.991218 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 16:31:28.992712 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.992688 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 16:31:28.992864 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.992841 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 16:31:28.993898 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.993879 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9cr5"
Apr 17 16:31:28.995362 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:28.995333 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 16:31:28.995774 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:28.995747 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-170.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 16:31:28.996615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.996593 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-170.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 16:31:28.997631 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.997615 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 16:31:28.997708 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.997667 2578 server.go:1295] "Started kubelet"
Apr 17 16:31:28.997775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.997714 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 16:31:28.997825 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.997766 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 16:31:28.997871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.997828 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 16:31:28.998893 ip-10-0-138-170 systemd[1]: Started Kubernetes Kubelet.
Apr 17 16:31:28.999043 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.998896 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 16:31:29.000020 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:28.999998 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 16:31:29.009832 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.009809 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:29.010335 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.010305 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 16:31:29.010335 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.010298 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 16:31:29.011030 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.011001 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.011749 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.011733 2578 factory.go:55] Registering systemd factory
Apr 17 16:31:29.011838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.011796 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 17 16:31:29.012023 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012010 2578 factory.go:153] Registering CRI-O factory
Apr 17 16:31:29.012023 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012022 2578 factory.go:223] Registration of the crio container factory successfully
Apr 17 16:31:29.012166 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012082 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 16:31:29.012166 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012105 2578 factory.go:103] Registering Raw factory
Apr 17 16:31:29.012166 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012115 2578 manager.go:1196] Started watching for new ooms in manager
Apr 17 16:31:29.012166 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012157 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 16:31:29.012437 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.012423 2578 manager.go:319] Starting recovery of all containers
Apr 17 16:31:29.013106 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.013087 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 16:31:29.013176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.013110 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 16:31:29.013233 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.013222 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 16:31:29.013290 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.013234 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 16:31:29.013447 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.013431 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:29.016227 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.016033 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-170.ec2.internal\" not found" node="ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.019832 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.019804 2578 manager.go:324] Recovery completed
Apr 17 16:31:29.021642 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.021620 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 16:31:29.025362 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.025341 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.027768 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.027747 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.027849 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.027774 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.027849 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.027785 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.028253 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.028240 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 16:31:29.028299 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.028254 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 16:31:29.028299 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.028272 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:31:29.031152 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.031141 2578 policy_none.go:49] "None policy: Start"
Apr 17 16:31:29.031186 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.031157 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 16:31:29.031186 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.031166 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 16:31:29.070854 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.070839 2578 manager.go:341] "Starting Device Plugin manager"
Apr 17 16:31:29.070956 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.070911 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 16:31:29.070956 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.070926 2578 server.go:85] "Starting device plugin registration server"
Apr 17 16:31:29.071286 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.071206 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 16:31:29.071286 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.071220 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 16:31:29.071566 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.071359 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 16:31:29.071566 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.071430 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 16:31:29.071566 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.071440 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 16:31:29.072026 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.072000 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 16:31:29.072137 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.072050 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.163669 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.163605 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 16:31:29.164984 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.164963 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 16:31:29.165090 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.164997 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 16:31:29.165090 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.165017 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 16:31:29.165090 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.165025 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 16:31:29.165090 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.165079 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 16:31:29.167292 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.167269 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:29.172171 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.172155 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.173090 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.173054 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.173143 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.173110 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.173143 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.173126 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.173206 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.173153 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.182355 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.182341 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.182403 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.182362 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-170.ec2.internal\": node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.219324 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.219307 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.265344 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.265323 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"]
Apr 17 16:31:29.265412 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.265402 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.266213 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.266199 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.266265 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.266227 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.266265 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.266238 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.267604 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.267592 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.267787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.267773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.267827 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.267802 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.268287 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268271 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.268373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268300 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.268373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268313 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.268373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268275 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.268373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268340 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.268373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.268354 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.269468 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.269454 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.269518 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.269483 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:29.270117 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.270102 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:29.270199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.270124 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:29.270199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.270135 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:29.293640 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.293615 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-170.ec2.internal\" not found" node="ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.297868 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.297853 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-170.ec2.internal\" not found" node="ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.315134 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.315117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.315211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.315142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.315211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.315161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/72d3944a84d00c65c1b4be69187354b2-config\") pod \"kube-apiserver-proxy-ip-10-0-138-170.ec2.internal\" (UID: \"72d3944a84d00c65c1b4be69187354b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.320182 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.320157 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.415848 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.415848 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.415977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/72d3944a84d00c65c1b4be69187354b2-config\") pod \"kube-apiserver-proxy-ip-10-0-138-170.ec2.internal\" (UID: \"72d3944a84d00c65c1b4be69187354b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.415977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/72d3944a84d00c65c1b4be69187354b2-config\") pod \"kube-apiserver-proxy-ip-10-0-138-170.ec2.internal\" (UID: \"72d3944a84d00c65c1b4be69187354b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.415977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.415977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.415935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9559efac8a150a05024d8f64b4bca67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal\" (UID: \"b9559efac8a150a05024d8f64b4bca67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.420916 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.420901 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.521539 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.521510 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.595726 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.595707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.600452 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.600435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.622627 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.622603 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.723229 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.723159 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.823666 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.823629 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-170.ec2.internal\" not found"
Apr 17 16:31:29.853296 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.853275 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:29.906226 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.906206 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:29.906761 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.906298 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:29.906761 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.906304 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:29.906761 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.906356 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:29.912366 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.912349 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.931522 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.931506 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:29.933725 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.933712 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal"
Apr 17 16:31:29.942231 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.942213 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:29.984362 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.984305 2578 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:29.997398 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.997364 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:28 +0000 UTC" deadline="2028-02-02 11:00:34.218180136 +0000 UTC"
Apr 17 16:31:29.997398 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.997398 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15738h29m4.220784818s"
Apr 17 16:31:29.998047 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.998028 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:29.998399 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.998380 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-598xw","openshift-network-diagnostics/network-check-target-hqwh2","openshift-network-operator/iptables-alerter-f8jht","kube-system/konnectivity-agent-vpndd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm","openshift-dns/node-resolver-7l9qg","openshift-image-registry/node-ca-5ft4z","openshift-ovn-kubernetes/ovnkube-node-8jknk","kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal","openshift-cluster-node-tuning-operator/tuned-ljs5t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal","openshift-multus/multus-additional-cni-plugins-gdzlp","openshift-multus/multus-lg6kr"]
Apr 17 16:31:29.999661 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:29.999645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:31:29.999744 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:29.999723 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc"
Apr 17 16:31:30.001635 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.001617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:31:30.001713 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.001679 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d"
Apr 17 16:31:30.002855 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.002836 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f8jht"
Apr 17 16:31:30.004493 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.004472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vpndd"
Apr 17 16:31:30.006264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.006236 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.006354 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.006295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.006418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.006354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.007542 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.007523 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5ft4z"
Apr 17 16:31:30.008682 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.008668 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.009814 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.009795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.009922 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.009905 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:30.011245 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.011231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.012697 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.012669 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.018893 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.018870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:30.018967 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.018907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljh8x\" (UniqueName: \"kubernetes.io/projected/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-kube-api-access-ljh8x\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.018967 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.018937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/30498c9f-32f4-458b-914f-a3fc1f718376-agent-certs\") pod \"konnectivity-agent-vpndd\" (UID: \"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:30.019059 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.018982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-socket-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.019059 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-log-socket\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-netns\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-bin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-systemd\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jsv\" (UniqueName: \"kubernetes.io/projected/d133b405-3379-47da-adb1-775153ea7854-kube-api-access-h7jsv\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: 
I0417 16:31:30.019229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/30498c9f-32f4-458b-914f-a3fc1f718376-konnectivity-ca\") pod \"konnectivity-agent-vpndd\" (UID: \"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-sys-fs\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-env-overrides\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.019327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-hosts-file\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-k8s-cni-cncf-io\") pod \"multus-lg6kr\" (UID: 
\"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-multus\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-hostroot\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt57r\" (UniqueName: \"kubernetes.io/projected/f8e1f18a-02d8-4db9-8e72-f140011fc044-kube-api-access-pt57r\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-conf\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-host\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-tmp\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-slash\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-etc-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:30.019623 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019626 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-run\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-sys\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-serviceca\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-netd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020011 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:31:30.019749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-daemon-config\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-etc-kubernetes\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d10aa7a-8020-44ad-9772-7262239be5f1-host-slash\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5wd\" (UniqueName: \"kubernetes.io/projected/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-kube-api-access-bx5wd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-cnibin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019961 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-netns\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.019985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dcg6\" (UniqueName: \"kubernetes.io/projected/6d10aa7a-8020-44ad-9772-7262239be5f1-kube-api-access-4dcg6\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.020011 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg82g\" (UniqueName: 
\"kubernetes.io/projected/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-kube-api-access-jg82g\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysconfig\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-cni-binary-copy\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d10aa7a-8020-44ad-9772-7262239be5f1-iptables-alerter-script\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:31:30.020162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cnibin\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-registration-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-device-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-systemd-units\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-ovn\") pod \"ovnkube-node-8jknk\" (UID: 
\"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8qm\" (UniqueName: \"kubernetes.io/projected/a6f8630a-c602-4066-a1c1-66f602f947fc-kube-api-access-rz8qm\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-host\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-kubelet\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-var-lib-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.020534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-bin\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovn-node-metrics-cert\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-modprobe-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:31:30.020486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-system-cni-dir\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-os-release\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dgms\" (UniqueName: \"kubernetes.io/projected/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-kube-api-access-4dgms\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-multus-certs\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-kubernetes\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-etc-tuned\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-conf-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb45q\" (UniqueName: \"kubernetes.io/projected/5778df28-4298-45d8-b1fa-b84fdd133aa4-kube-api-access-mb45q\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-systemd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-var-lib-kubelet\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.021035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-os-release\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-tmp-dir\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-config\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-script-lib\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-system-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-kubelet\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-node-log\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-lib-modules\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.021483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.020877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-socket-dir-parent\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.032342 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.032318 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:30.032410 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.032317 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:30.032410 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.032319 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.033617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.033743 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033718 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:30.033857 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033801 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:30.033857 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d6lcz\""
Apr 17 16:31:30.033969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033946 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.034029 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033977 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.034029 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.033977 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.034157 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.034037 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.034375 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.034359 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.034375 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.034373 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:30.034475 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.034378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:30.036196 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.036133 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.036279 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.036217 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.036462 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.036445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:30.037984 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.037966 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.038305 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.038292 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:30.038415 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.038401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.038810 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.038795 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:30.038903 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.038799 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:30.039184 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:30.039270 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:30.039445 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:30.039525 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b82wc\""
Apr 17 16:31:30.039525 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039473 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xwt9g\""
Apr 17 16:31:30.039525 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039512 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:30.039525 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039471 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fcg7b\""
Apr 17 16:31:30.039685 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.039522 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fldzd\""
Apr 17 16:31:30.042787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.042685 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z8mmd\""
Apr 17 16:31:30.042787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.042757 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xxvfp\""
Apr 17 16:31:30.042787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.042758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xhmkt\""
Apr 17 16:31:30.042787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.042780 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:30.043028 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.042791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v7ztv\""
Apr 17 16:31:30.048561 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.048526 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:30.069766 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.069745 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkbqp"
Apr 17 16:31:30.081570 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.081548 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkbqp"
Apr 17 16:31:30.113435 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.113418 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:31:30.121017 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jsv\" (UniqueName: \"kubernetes.io/projected/d133b405-3379-47da-adb1-775153ea7854-kube-api-access-h7jsv\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/30498c9f-32f4-458b-914f-a3fc1f718376-konnectivity-ca\") pod \"konnectivity-agent-vpndd\" (UID: \"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd"
Apr 17 16:31:30.121110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-sys-fs\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.121110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-env-overrides\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-hosts-file\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.121264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-k8s-cni-cncf-io\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-multus\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-sys-fs\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.121264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-hosts-file\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.121264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-multus\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-hostroot\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt57r\" (UniqueName: \"kubernetes.io/projected/f8e1f18a-02d8-4db9-8e72-f140011fc044-kube-api-access-pt57r\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-conf\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-host\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-k8s-cni-cncf-io\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-hostroot\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.121428 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-tmp\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121620 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-host\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121620 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-slash\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121620 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-etc-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121620 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:31:30.121620 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-run\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-slash\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-etc-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-env-overrides\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/30498c9f-32f4-458b-914f-a3fc1f718376-konnectivity-ca\") pod \"konnectivity-agent-vpndd\" (UID: \"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121676 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-conf\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.121758 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-sys\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.121838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-run\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.121850 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:31:30.62180387 +0000 UTC m=+2.143380800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-sys\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-serviceca\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-netd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-netd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.121997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-daemon-config\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-etc-kubernetes\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d10aa7a-8020-44ad-9772-7262239be5f1-host-slash\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5wd\" (UniqueName: \"kubernetes.io/projected/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-kube-api-access-bx5wd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-etc-kubernetes\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d10aa7a-8020-44ad-9772-7262239be5f1-host-slash\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht"
Apr 17 16:31:30.122235 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-cnibin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-cnibin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-netns\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dcg6\" (UniqueName: \"kubernetes.io/projected/6d10aa7a-8020-44ad-9772-7262239be5f1-kube-api-access-4dcg6\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-serviceca\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-netns\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg82g\" (UniqueName: \"kubernetes.io/projected/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-kube-api-access-jg82g\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysconfig\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysconfig\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-daemon-config\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-cni-binary-copy\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d10aa7a-8020-44ad-9772-7262239be5f1-iptables-alerter-script\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cnibin\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.122969 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-registration-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-device-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cnibin\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.123625 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:31:30.122881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-systemd-units\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-ovn\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-registration-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8qm\" (UniqueName: \"kubernetes.io/projected/a6f8630a-c602-4066-a1c1-66f602f947fc-kube-api-access-rz8qm\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.122986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-device-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-host\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-kubelet\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-var-lib-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: 
\"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-bin\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-sysctl-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovn-node-metrics-cert\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-modprobe-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.123625 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-system-cni-dir\") pod 
\"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-os-release\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8e1f18a-02d8-4db9-8e72-f140011fc044-cni-binary-copy\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-system-cni-dir\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-modprobe-d\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-systemd-units\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d10aa7a-8020-44ad-9772-7262239be5f1-iptables-alerter-script\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-kubelet\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-ovn\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-os-release\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-var-lib-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-cni-bin\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-host\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dgms\" (UniqueName: \"kubernetes.io/projected/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-kube-api-access-4dgms\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.124418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-multus-certs\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.123995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-kubernetes\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-etc-tuned\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-conf-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb45q\" (UniqueName: \"kubernetes.io/projected/5778df28-4298-45d8-b1fa-b84fdd133aa4-kube-api-access-mb45q\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-kubernetes\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-systemd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-systemd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-run-multus-certs\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-var-lib-kubelet\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" 
Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-os-release\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-tmp-dir\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-conf-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-config\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.125005 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-script-lib\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-system-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-kubelet\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-node-log\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-lib-modules\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.124983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-socket-dir-parent\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljh8x\" (UniqueName: \"kubernetes.io/projected/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-kube-api-access-ljh8x\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-config\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-os-release\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp"
Apr 17 16:31:30.125991 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.125854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-node-log\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-tmp-dir\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-tmp\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovnkube-script-lib\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-kubelet\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-run-openvswitch\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-multus-socket-dir-parent\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-system-cni-dir\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-var-lib-kubelet\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.126617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-lib-modules\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/30498c9f-32f4-458b-914f-a3fc1f718376-agent-certs\") pod \"konnectivity-agent-vpndd\" (UID: \"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-socket-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-log-socket\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-netns\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-bin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-systemd\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t"
Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-log-socket\") pod \"ovnkube-node-8jknk\" (UID:
\"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5778df28-4298-45d8-b1fa-b84fdd133aa4-socket-dir\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-host-run-netns\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8e1f18a-02d8-4db9-8e72-f140011fc044-host-var-lib-cni-bin\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.127038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.126985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d133b405-3379-47da-adb1-775153ea7854-etc-systemd\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.128794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.128757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/30498c9f-32f4-458b-914f-a3fc1f718376-agent-certs\") pod \"konnectivity-agent-vpndd\" (UID: 
\"30498c9f-32f4-458b-914f-a3fc1f718376\") " pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:30.130547 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.130518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d133b405-3379-47da-adb1-775153ea7854-etc-tuned\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.132048 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.132024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-ovn-node-metrics-cert\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.132278 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.132262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt57r\" (UniqueName: \"kubernetes.io/projected/f8e1f18a-02d8-4db9-8e72-f140011fc044-kube-api-access-pt57r\") pod \"multus-lg6kr\" (UID: \"f8e1f18a-02d8-4db9-8e72-f140011fc044\") " pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.132479 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.132454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jsv\" (UniqueName: \"kubernetes.io/projected/d133b405-3379-47da-adb1-775153ea7854-kube-api-access-h7jsv\") pod \"tuned-ljs5t\" (UID: \"d133b405-3379-47da-adb1-775153ea7854\") " pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.135647 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.135379 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:30.135647 ip-10-0-138-170 kubenswrapper[2578]: E0417 
16:31:30.135404 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:30.135647 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.135417 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:30.135647 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.135491 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:30.635472378 +0000 UTC m=+2.157049310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:30.136485 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.136459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb45q\" (UniqueName: \"kubernetes.io/projected/5778df28-4298-45d8-b1fa-b84fdd133aa4-kube-api-access-mb45q\") pod \"aws-ebs-csi-driver-node-tbsjm\" (UID: \"5778df28-4298-45d8-b1fa-b84fdd133aa4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.137095 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.137050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4dcg6\" (UniqueName: \"kubernetes.io/projected/6d10aa7a-8020-44ad-9772-7262239be5f1-kube-api-access-4dcg6\") pod \"iptables-alerter-f8jht\" (UID: \"6d10aa7a-8020-44ad-9772-7262239be5f1\") " pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.137095 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.137090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dgms\" (UniqueName: \"kubernetes.io/projected/eb979380-a8c1-43a4-b8ad-f3ba0967a2d7-kube-api-access-4dgms\") pod \"node-ca-5ft4z\" (UID: \"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7\") " pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.137592 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.137569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg82g\" (UniqueName: \"kubernetes.io/projected/61ffcc07-b8ef-4fcc-ab95-d8a4d75484df-kube-api-access-jg82g\") pod \"node-resolver-7l9qg\" (UID: \"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df\") " pod="openshift-dns/node-resolver-7l9qg" Apr 17 16:31:30.137708 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.137688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8qm\" (UniqueName: \"kubernetes.io/projected/a6f8630a-c602-4066-a1c1-66f602f947fc-kube-api-access-rz8qm\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:30.137794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.137780 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5wd\" (UniqueName: \"kubernetes.io/projected/e9449b84-7aaa-4237-8ea9-618f1fb0c8be-kube-api-access-bx5wd\") pod \"ovnkube-node-8jknk\" (UID: \"e9449b84-7aaa-4237-8ea9-618f1fb0c8be\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.138996 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.138968 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljh8x\" (UniqueName: \"kubernetes.io/projected/8039245d-5cc0-42eb-bd46-e84c3ff6d2dd-kube-api-access-ljh8x\") pod \"multus-additional-cni-plugins-gdzlp\" (UID: \"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd\") " pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.141585 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.141571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lg6kr" Apr 17 16:31:30.162081 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.162042 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9559efac8a150a05024d8f64b4bca67.slice/crio-68559ef604951ef493d2058022e050d8853df64083b1f62b2a435a97b94e5bbe WatchSource:0}: Error finding container 68559ef604951ef493d2058022e050d8853df64083b1f62b2a435a97b94e5bbe: Status 404 returned error can't find the container with id 68559ef604951ef493d2058022e050d8853df64083b1f62b2a435a97b94e5bbe Apr 17 16:31:30.162700 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.162674 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e1f18a_02d8_4db9_8e72_f140011fc044.slice/crio-009df39c0b00c704736b3b7f28ffb576b844949747a8c5d6981f8a8cea518986 WatchSource:0}: Error finding container 009df39c0b00c704736b3b7f28ffb576b844949747a8c5d6981f8a8cea518986: Status 404 returned error can't find the container with id 009df39c0b00c704736b3b7f28ffb576b844949747a8c5d6981f8a8cea518986 Apr 17 16:31:30.163630 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.163608 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d3944a84d00c65c1b4be69187354b2.slice/crio-3765fe2bdbda0f8662e7764a9020876ddda6519a532e489cd71ee06cef9b4153 WatchSource:0}: Error 
finding container 3765fe2bdbda0f8662e7764a9020876ddda6519a532e489cd71ee06cef9b4153: Status 404 returned error can't find the container with id 3765fe2bdbda0f8662e7764a9020876ddda6519a532e489cd71ee06cef9b4153 Apr 17 16:31:30.167642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.167621 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:30.167817 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.167781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal" event={"ID":"72d3944a84d00c65c1b4be69187354b2","Type":"ContainerStarted","Data":"3765fe2bdbda0f8662e7764a9020876ddda6519a532e489cd71ee06cef9b4153"} Apr 17 16:31:30.168879 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.168859 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lg6kr" event={"ID":"f8e1f18a-02d8-4db9-8e72-f140011fc044","Type":"ContainerStarted","Data":"009df39c0b00c704736b3b7f28ffb576b844949747a8c5d6981f8a8cea518986"} Apr 17 16:31:30.170080 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.169974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal" event={"ID":"b9559efac8a150a05024d8f64b4bca67","Type":"ContainerStarted","Data":"68559ef604951ef493d2058022e050d8853df64083b1f62b2a435a97b94e5bbe"} Apr 17 16:31:30.289567 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.289509 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:30.333565 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.333540 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-f8jht" Apr 17 16:31:30.340059 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.340037 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d10aa7a_8020_44ad_9772_7262239be5f1.slice/crio-07ebcfe899c9027499bab62676f17a29d3b4202b8a0fdfc8440b84916814f7ef WatchSource:0}: Error finding container 07ebcfe899c9027499bab62676f17a29d3b4202b8a0fdfc8440b84916814f7ef: Status 404 returned error can't find the container with id 07ebcfe899c9027499bab62676f17a29d3b4202b8a0fdfc8440b84916814f7ef Apr 17 16:31:30.350365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.350349 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:30.356415 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.356389 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30498c9f_32f4_458b_914f_a3fc1f718376.slice/crio-9f823f3c11d845855d54c041159fe1eb5d88bbc43ba5435c0ca7c76e99874f1a WatchSource:0}: Error finding container 9f823f3c11d845855d54c041159fe1eb5d88bbc43ba5435c0ca7c76e99874f1a: Status 404 returned error can't find the container with id 9f823f3c11d845855d54c041159fe1eb5d88bbc43ba5435c0ca7c76e99874f1a Apr 17 16:31:30.369909 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.369891 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" Apr 17 16:31:30.376980 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.376959 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5778df28_4298_45d8_b1fa_b84fdd133aa4.slice/crio-6861047ef5cc8753be17bfee933a57dba2aac9ac4f4a509cc33637fd35002d9d WatchSource:0}: Error finding container 6861047ef5cc8753be17bfee933a57dba2aac9ac4f4a509cc33637fd35002d9d: Status 404 returned error can't find the container with id 6861047ef5cc8753be17bfee933a57dba2aac9ac4f4a509cc33637fd35002d9d Apr 17 16:31:30.380730 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.380715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7l9qg" Apr 17 16:31:30.386280 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.386260 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ffcc07_b8ef_4fcc_ab95_d8a4d75484df.slice/crio-00e6ec1adbf5c3a9af25bbd57a0694b8a7ad6ea8564112fa08678f4ba162ec97 WatchSource:0}: Error finding container 00e6ec1adbf5c3a9af25bbd57a0694b8a7ad6ea8564112fa08678f4ba162ec97: Status 404 returned error can't find the container with id 00e6ec1adbf5c3a9af25bbd57a0694b8a7ad6ea8564112fa08678f4ba162ec97 Apr 17 16:31:30.397617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.397600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5ft4z" Apr 17 16:31:30.402795 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.402781 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:30.402991 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.402972 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb979380_a8c1_43a4_b8ad_f3ba0967a2d7.slice/crio-0ac30f7be6c57bb3a279197e08f432e446a2890d4f908b17b5d57dad754f74d1 WatchSource:0}: Error finding container 0ac30f7be6c57bb3a279197e08f432e446a2890d4f908b17b5d57dad754f74d1: Status 404 returned error can't find the container with id 0ac30f7be6c57bb3a279197e08f432e446a2890d4f908b17b5d57dad754f74d1 Apr 17 16:31:30.408595 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.408573 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9449b84_7aaa_4237_8ea9_618f1fb0c8be.slice/crio-3437c1d6a1459e69bccdbdd8048ebce0b83672d7ac5c03c21853550de57bab49 WatchSource:0}: Error finding container 3437c1d6a1459e69bccdbdd8048ebce0b83672d7ac5c03c21853550de57bab49: Status 404 returned error can't find the container with id 3437c1d6a1459e69bccdbdd8048ebce0b83672d7ac5c03c21853550de57bab49 Apr 17 16:31:30.417245 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.417227 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" Apr 17 16:31:30.422804 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.422786 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd133b405_3379_47da_adb1_775153ea7854.slice/crio-7aedcc673ed2957411569c52a037855fd6f44f30972ca36b225e7c84cf1ecdac WatchSource:0}: Error finding container 7aedcc673ed2957411569c52a037855fd6f44f30972ca36b225e7c84cf1ecdac: Status 404 returned error can't find the container with id 7aedcc673ed2957411569c52a037855fd6f44f30972ca36b225e7c84cf1ecdac Apr 17 16:31:30.423900 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.423881 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" Apr 17 16:31:30.429520 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:31:30.429499 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8039245d_5cc0_42eb_bd46_e84c3ff6d2dd.slice/crio-d82022d055146b98e33164b371469f4beeb0c95025bf8ee38197776038f50e3f WatchSource:0}: Error finding container d82022d055146b98e33164b371469f4beeb0c95025bf8ee38197776038f50e3f: Status 404 returned error can't find the container with id d82022d055146b98e33164b371469f4beeb0c95025bf8ee38197776038f50e3f Apr 17 16:31:30.630869 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.630785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:30.631025 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.630919 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:30.631025 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.630998 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:31:31.630978772 +0000 UTC m=+3.152555717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:30.731903 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.731867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:30.732117 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.732098 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:30.732194 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.732128 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:30.732194 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.732140 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:30.732291 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:30.732209 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:31.732188706 +0000 UTC m=+3.253765638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:30.750829 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.750801 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:30.783131 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:30.783100 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:31.083006 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.082923 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:30 +0000 UTC" deadline="2027-09-16 04:25:51.464986202 +0000 UTC" Apr 17 16:31:31.083006 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.082960 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12395h54m20.382029783s" Apr 17 16:31:31.167799 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.167305 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:31.167799 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.167418 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:31.187142 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.187113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5ft4z" event={"ID":"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7","Type":"ContainerStarted","Data":"0ac30f7be6c57bb3a279197e08f432e446a2890d4f908b17b5d57dad754f74d1"} Apr 17 16:31:31.208817 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.208752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l9qg" event={"ID":"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df","Type":"ContainerStarted","Data":"00e6ec1adbf5c3a9af25bbd57a0694b8a7ad6ea8564112fa08678f4ba162ec97"} Apr 17 16:31:31.222480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.222447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vpndd" event={"ID":"30498c9f-32f4-458b-914f-a3fc1f718376","Type":"ContainerStarted","Data":"9f823f3c11d845855d54c041159fe1eb5d88bbc43ba5435c0ca7c76e99874f1a"} Apr 17 16:31:31.244015 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.243981 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerStarted","Data":"d82022d055146b98e33164b371469f4beeb0c95025bf8ee38197776038f50e3f"} Apr 17 16:31:31.261161 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:31:31.261129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" event={"ID":"d133b405-3379-47da-adb1-775153ea7854","Type":"ContainerStarted","Data":"7aedcc673ed2957411569c52a037855fd6f44f30972ca36b225e7c84cf1ecdac"} Apr 17 16:31:31.283742 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.283711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"3437c1d6a1459e69bccdbdd8048ebce0b83672d7ac5c03c21853550de57bab49"} Apr 17 16:31:31.309003 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.308964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" event={"ID":"5778df28-4298-45d8-b1fa-b84fdd133aa4","Type":"ContainerStarted","Data":"6861047ef5cc8753be17bfee933a57dba2aac9ac4f4a509cc33637fd35002d9d"} Apr 17 16:31:31.323284 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.323256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f8jht" event={"ID":"6d10aa7a-8020-44ad-9772-7262239be5f1","Type":"ContainerStarted","Data":"07ebcfe899c9027499bab62676f17a29d3b4202b8a0fdfc8440b84916814f7ef"} Apr 17 16:31:31.644468 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.643947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:31.644468 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.644116 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:31.644468 ip-10-0-138-170 
kubenswrapper[2578]: E0417 16:31:31.644175 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.644156832 +0000 UTC m=+5.165733780 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:31.744351 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.744300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:31.744526 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.744445 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:31.744526 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.744465 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:31.744526 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.744478 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 
16:31:31.744695 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:31.744530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.74451296 +0000 UTC m=+5.266089893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:31.834844 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:31.834806 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:32.083602 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:32.083517 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:30 +0000 UTC" deadline="2027-12-31 16:30:52.782001536 +0000 UTC" Apr 17 16:31:32.083602 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:32.083556 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14951h59m20.698448696s" Apr 17 16:31:32.166162 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:32.165994 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:32.166162 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:32.166138 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:33.165593 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:33.165556 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:33.166046 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.165689 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:33.659399 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:33.659322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:33.659570 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.659492 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:33.659570 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.659553 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:31:37.659533393 +0000 UTC m=+9.181110329 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:33.759921 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:33.759886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:33.760122 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.760084 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:33.760122 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.760104 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:33.760122 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.760114 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:33.760286 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:33.760165 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:37.760146489 +0000 UTC m=+9.281723434 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:34.166128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:34.166095 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:34.166476 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:34.166231 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:35.166165 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:35.166126 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:35.166579 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:35.166235 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:36.165391 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:36.165354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:36.165562 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:36.165495 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:37.168990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:37.168961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:37.169447 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.169100 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:37.691561 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:37.691524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:37.691736 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.691702 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:37.691801 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.691763 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.691744425 +0000 UTC m=+17.213321359 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:37.792732 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:37.792699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:37.792932 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.792906 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:37.792996 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.792936 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:37.792996 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.792947 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:37.793090 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:37.793009 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:45.792994423 +0000 UTC m=+17.314571367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:38.165906 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:38.165720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:38.165906 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:38.165862 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:39.166220 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:39.166190 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:39.166604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:39.166268 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:40.165797 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:40.165768 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:40.165988 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:40.165885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:41.166200 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:41.166167 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:41.166601 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:41.166296 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:42.165880 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:42.165844 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:42.166059 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:42.165975 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:43.166296 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:43.166261 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:43.166726 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:43.166397 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:44.165880 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:44.165844 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:44.166036 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:44.165971 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:45.165888 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:45.165853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:45.166304 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.165972 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:45.753432 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:45.753399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:45.753642 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.753522 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:45.753642 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.753597 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:32:01.753576162 +0000 UTC m=+33.275153095 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:45.854288 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:45.854252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:45.854466 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.854395 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:45.854466 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.854411 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:45.854466 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.854421 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:45.854595 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:45.854477 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:01.854461334 +0000 UTC m=+33.376038287 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:46.166019 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:46.165989 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:46.166476 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:46.166136 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:47.165613 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:47.165580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:47.165790 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:47.165696 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:48.165435 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:48.165402 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:48.165879 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:48.165545 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:49.165906 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.165881 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:49.166300 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:49.165963 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:49.366313 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.366258 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lg6kr" event={"ID":"f8e1f18a-02d8-4db9-8e72-f140011fc044","Type":"ContainerStarted","Data":"e8e7b8992df5ac95ddbf0492e5eede22a1a2746fcb7c517198494b3fbf2923ff"} Apr 17 16:31:49.367540 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.367518 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" event={"ID":"d133b405-3379-47da-adb1-775153ea7854","Type":"ContainerStarted","Data":"4db098e23846b104a623c44f6210e85289ccc019ff9bb6571ff51e91702aab0c"} Apr 17 16:31:49.369547 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.369520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"0a69db4b2c790c0a5929cbd0f4849f3b38652f8ae357585bcf01ca74cdf544e6"} Apr 17 16:31:49.369652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.369555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"7ec968e7b357aff49a06353036495407c62a4768c389d212046e7042bbe4c4f7"} Apr 17 16:31:49.369652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.369572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"8fc31e78cd97d1d084bdea4413c66abe841796d53aad04406177b51447442523"} Apr 17 16:31:49.370782 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.370748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal" event={"ID":"72d3944a84d00c65c1b4be69187354b2","Type":"ContainerStarted","Data":"f37a36291b13f518b44f54a643a0edc753b440e5ffd7b15a12af362c80093296"} Apr 17 16:31:49.384091 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.384038 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lg6kr" podStartSLOduration=1.655937392 podStartE2EDuration="20.384027331s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.167912881 +0000 UTC m=+1.689489809" lastFinishedPulling="2026-04-17 16:31:48.896002818 +0000 UTC m=+20.417579748" observedRunningTime="2026-04-17 16:31:49.383904533 +0000 UTC m=+20.905481496" watchObservedRunningTime="2026-04-17 16:31:49.384027331 +0000 UTC m=+20.905604292" Apr 17 16:31:49.418471 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:49.418436 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ljs5t" podStartSLOduration=2.0119891 podStartE2EDuration="20.418425388s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.424808791 +0000 UTC m=+1.946385719" lastFinishedPulling="2026-04-17 16:31:48.831245062 +0000 UTC m=+20.352822007" observedRunningTime="2026-04-17 16:31:49.404186266 +0000 UTC m=+20.925763218" watchObservedRunningTime="2026-04-17 16:31:49.418425388 +0000 UTC m=+20.940002338" Apr 17 16:31:50.165883 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.165708 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:50.166032 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:50.165968 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:50.375225 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:31:50.375492 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375469 2578 generic.go:358] "Generic (PLEG): container finished" podID="e9449b84-7aaa-4237-8ea9-618f1fb0c8be" containerID="7ec968e7b357aff49a06353036495407c62a4768c389d212046e7042bbe4c4f7" exitCode=1 Apr 17 16:31:50.375557 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerDied","Data":"7ec968e7b357aff49a06353036495407c62a4768c389d212046e7042bbe4c4f7"} Apr 17 16:31:50.375614 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"f41cfb5da712c5397c5459192182d5f8517b28d5bca605457edc1af8c25e6f98"} Apr 17 16:31:50.375614 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" 
event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"9c3ce973aeb612786ca3b8f266869b9fdb9f0ab1820a864fa5136ad62b0d91af"} Apr 17 16:31:50.375614 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.375588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"f1dcb5dfd0e0afd8cfdb695a9a68db59d56b22bf342476c19e16e7aa68bcbd40"} Apr 17 16:31:50.376816 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.376791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f8jht" event={"ID":"6d10aa7a-8020-44ad-9772-7262239be5f1","Type":"ContainerStarted","Data":"027d0850cc323d5d32dd224eaa13d7894ee982d7c3794b33c4297dbb4c56407d"} Apr 17 16:31:50.392347 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.392311 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-170.ec2.internal" podStartSLOduration=21.392298038 podStartE2EDuration="21.392298038s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:49.418473569 +0000 UTC m=+20.940050516" watchObservedRunningTime="2026-04-17 16:31:50.392298038 +0000 UTC m=+21.913874984" Apr 17 16:31:50.392709 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:50.392684 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f8jht" podStartSLOduration=2.882054476 podStartE2EDuration="21.392677136s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.341597956 +0000 UTC m=+1.863174884" lastFinishedPulling="2026-04-17 16:31:48.852220602 +0000 UTC m=+20.373797544" observedRunningTime="2026-04-17 16:31:50.392433384 +0000 UTC 
m=+21.914010334" watchObservedRunningTime="2026-04-17 16:31:50.392677136 +0000 UTC m=+21.914254086" Apr 17 16:31:51.165706 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:51.165674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:51.165851 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:51.165793 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:52.166354 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:52.166191 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:52.166666 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:52.166432 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:52.382801 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:52.382769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:31:52.383395 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:52.383371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"b7407896f8f21c1f9c84cfa5819df23ed041aaf9808084f23b58468b6c18c5b2"} Apr 17 16:31:53.166196 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.166171 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:53.166340 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:53.166266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:53.386278 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.386245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vpndd" event={"ID":"30498c9f-32f4-458b-914f-a3fc1f718376","Type":"ContainerStarted","Data":"50a3a25751dd77d5db094da18d8fcb896022bcb58ed2a9eb07665dfef4394eac"} Apr 17 16:31:53.387592 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.387565 2578 generic.go:358] "Generic (PLEG): container finished" podID="b9559efac8a150a05024d8f64b4bca67" containerID="01d9530729a88c6a2fb4924e25dfd9ccfc7bb89a4566f9379431ee079e2e481a" exitCode=0 Apr 17 16:31:53.387718 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.387620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal" event={"ID":"b9559efac8a150a05024d8f64b4bca67","Type":"ContainerDied","Data":"01d9530729a88c6a2fb4924e25dfd9ccfc7bb89a4566f9379431ee079e2e481a"} Apr 17 16:31:53.388929 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.388904 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="8030c3a19a921a5fb9a2d7a85b65953104034ea31d54b39363f032f092fb376d" exitCode=0 Apr 17 16:31:53.389025 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.388977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"8030c3a19a921a5fb9a2d7a85b65953104034ea31d54b39363f032f092fb376d"} Apr 17 16:31:53.390395 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.390333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" 
event={"ID":"5778df28-4298-45d8-b1fa-b84fdd133aa4","Type":"ContainerStarted","Data":"2f68f3659d2a8f6ddda053f99ad658f387790ac08de8bf433980bfdd1462ca63"} Apr 17 16:31:53.391528 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.391507 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5ft4z" event={"ID":"eb979380-a8c1-43a4-b8ad-f3ba0967a2d7","Type":"ContainerStarted","Data":"146b77e0f8a0970d0d6bb1a0eefec7740415f9dc96389cadb8d4d0d334a0938f"} Apr 17 16:31:53.392802 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.392785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l9qg" event={"ID":"61ffcc07-b8ef-4fcc-ab95-d8a4d75484df","Type":"ContainerStarted","Data":"a0f24c34d605250c9a83486d3980dd51f74be868be9ca0d1b0c71f9a6ad86c00"} Apr 17 16:31:53.404354 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.404319 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vpndd" podStartSLOduration=5.9742834479999996 podStartE2EDuration="24.404309991s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.357810117 +0000 UTC m=+1.879387045" lastFinishedPulling="2026-04-17 16:31:48.787836645 +0000 UTC m=+20.309413588" observedRunningTime="2026-04-17 16:31:53.404276614 +0000 UTC m=+24.925853563" watchObservedRunningTime="2026-04-17 16:31:53.404309991 +0000 UTC m=+24.925886941" Apr 17 16:31:53.432497 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.432423 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7l9qg" podStartSLOduration=5.964634641 podStartE2EDuration="24.432411596s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.38759436 +0000 UTC m=+1.909171288" lastFinishedPulling="2026-04-17 16:31:48.855371301 +0000 UTC m=+20.376948243" observedRunningTime="2026-04-17 16:31:53.432393872 +0000 UTC 
m=+24.953970821" watchObservedRunningTime="2026-04-17 16:31:53.432411596 +0000 UTC m=+24.953988545" Apr 17 16:31:53.474402 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.474361 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5ft4z" podStartSLOduration=6.092087065 podStartE2EDuration="24.47434664s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.405578455 +0000 UTC m=+1.927155382" lastFinishedPulling="2026-04-17 16:31:48.787838016 +0000 UTC m=+20.309414957" observedRunningTime="2026-04-17 16:31:53.473943549 +0000 UTC m=+24.995520501" watchObservedRunningTime="2026-04-17 16:31:53.47434664 +0000 UTC m=+24.995923592" Apr 17 16:31:53.845377 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:53.845349 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:31:54.085360 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.085264 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:53.845373549Z","UUID":"e32dcb9f-7c70-47f7-81a1-5783e02f4d73","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:54.086872 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.086847 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:31:54.087015 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.086885 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:54.165829 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.165806 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:54.165943 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:54.165918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:54.398257 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.398176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal" event={"ID":"b9559efac8a150a05024d8f64b4bca67","Type":"ContainerStarted","Data":"fd71911ff272f972ee939c7b74578026bcb38393e426c6cf159e30668e77397b"} Apr 17 16:31:54.402418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.402394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:31:54.402786 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.402759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"1f8ebe4cf388e6e6833fffde1eddb4b1b8bfbd9b385291b94f8ad74573905350"} Apr 17 16:31:54.403231 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.403213 2578 scope.go:117] "RemoveContainer" containerID="7ec968e7b357aff49a06353036495407c62a4768c389d212046e7042bbe4c4f7" Apr 17 16:31:54.403874 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.403815 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:54.403874 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:31:54.403839 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:54.403874 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.403847 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:54.406453 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.406412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" event={"ID":"5778df28-4298-45d8-b1fa-b84fdd133aa4","Type":"ContainerStarted","Data":"0d2ad377376fa6562b2442e954641a0d207f577eddfbeb71cf70857cbda94fa8"} Apr 17 16:31:54.426430 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.426356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:54.427313 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.427295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" Apr 17 16:31:54.459614 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:54.459454 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-170.ec2.internal" podStartSLOduration=25.45944091 podStartE2EDuration="25.45944091s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:54.458953157 +0000 UTC m=+25.980530108" watchObservedRunningTime="2026-04-17 16:31:54.45944091 +0000 UTC m=+25.981017861" Apr 17 16:31:55.165994 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.165963 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:55.166186 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:55.166135 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:55.413781 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.413707 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:31:55.414402 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.414361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" event={"ID":"e9449b84-7aaa-4237-8ea9-618f1fb0c8be","Type":"ContainerStarted","Data":"e86172301b9386839fbf14f08eddd2adf056f33bc2db850740c7c1e6da8210e4"} Apr 17 16:31:55.429640 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.429554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" event={"ID":"5778df28-4298-45d8-b1fa-b84fdd133aa4","Type":"ContainerStarted","Data":"a2d6d65415a855bd87ef478335b820a8ae2ea2609ab37a24b06df8f6f1e46439"} Apr 17 16:31:55.455327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.455266 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk" podStartSLOduration=7.844366049 podStartE2EDuration="26.455247289s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.409954661 +0000 UTC m=+1.931531589" lastFinishedPulling="2026-04-17 16:31:49.020835887 +0000 UTC 
m=+20.542412829" observedRunningTime="2026-04-17 16:31:55.454325537 +0000 UTC m=+26.975902488" watchObservedRunningTime="2026-04-17 16:31:55.455247289 +0000 UTC m=+26.976824241" Apr 17 16:31:55.478090 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:55.476330 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tbsjm" podStartSLOduration=2.130369809 podStartE2EDuration="26.476311196s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.378347324 +0000 UTC m=+1.899924251" lastFinishedPulling="2026-04-17 16:31:54.72428871 +0000 UTC m=+26.245865638" observedRunningTime="2026-04-17 16:31:55.476182049 +0000 UTC m=+26.997759000" watchObservedRunningTime="2026-04-17 16:31:55.476311196 +0000 UTC m=+26.997888138" Apr 17 16:31:56.166060 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:56.165515 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:56.166060 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:56.165657 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:56.177243 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:56.177212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-598xw"] Apr 17 16:31:56.180092 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:56.180046 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hqwh2"] Apr 17 16:31:56.180203 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:56.180164 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:56.180278 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:56.180253 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:56.431115 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:56.431057 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:56.431639 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:56.431222 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:57.667916 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:57.667879 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:57.668524 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:57.668501 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:58.165416 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.165387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:31:58.165416 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.165414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:31:58.165629 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:58.165491 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:31:58.165671 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:31:58.165629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:31:58.438036 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.437954 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="dae4c6d951d382d4e6d6c022ba38c287335efc385843442a0a00645c9c7d2b02" exitCode=0 Apr 17 16:31:58.438195 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.438045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"dae4c6d951d382d4e6d6c022ba38c287335efc385843442a0a00645c9c7d2b02"} Apr 17 16:31:58.438264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.438226 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:58.438777 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:58.438740 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vpndd" Apr 17 16:31:59.441536 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:59.441326 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="220a1ce0384212ca098253e15d3aefc1ca2ecd94f3381c96597325b1a7b1b16f" exitCode=0 Apr 17 16:31:59.442047 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:31:59.441415 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"220a1ce0384212ca098253e15d3aefc1ca2ecd94f3381c96597325b1a7b1b16f"} Apr 17 16:32:00.165383 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:00.165356 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2" Apr 17 16:32:00.165478 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:00.165364 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:32:00.165564 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:00.165547 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:32:00.165604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:00.165448 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqwh2" podUID="ffde06b8-a22f-482c-89a5-3fa86598f73d" Apr 17 16:32:00.445798 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:00.445719 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="cc4d9481e2f4c68efa889f98a87876b0e7f69c129f54cd789865ec5c015b6193" exitCode=0 Apr 17 16:32:00.445798 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:00.445765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"cc4d9481e2f4c68efa889f98a87876b0e7f69c129f54cd789865ec5c015b6193"} Apr 17 16:32:01.278863 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.278831 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-170.ec2.internal" event="NodeReady" Apr 17 16:32:01.279015 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.278938 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:32:01.322565 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.322537 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ghkgl"] Apr 17 16:32:01.325252 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.325233 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lfzcd"] Apr 17 16:32:01.325387 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.325370 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.327599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.327579 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\"" Apr 17 16:32:01.327689 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.327602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:32:01.327689 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.327621 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:32:01.327948 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.327935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:32:01.329965 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.329944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:32:01.330056 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.329976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\"" Apr 17 16:32:01.330056 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.330030 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:32:01.330163 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.330050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:32:01.334637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.334619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ghkgl"] Apr 17 16:32:01.340351 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:32:01.340331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lfzcd"] Apr 17 16:32:01.469260 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.469260 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5dk\" (UniqueName: \"kubernetes.io/projected/c56755ae-c685-4cd5-a21d-9b2df9f5189f-kube-api-access-jk5dk\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.469260 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c56755ae-c685-4cd5-a21d-9b2df9f5189f-tmp-dir\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.469745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:32:01.469745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c56755ae-c685-4cd5-a21d-9b2df9f5189f-config-volume\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.469745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.469352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5crnd\" (UniqueName: \"kubernetes.io/projected/5eb99a8d-95ed-4e6b-8181-59a683f03f29-kube-api-access-5crnd\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:32:01.569560 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c56755ae-c685-4cd5-a21d-9b2df9f5189f-tmp-dir\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.569560 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:32:01.569697 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56755ae-c685-4cd5-a21d-9b2df9f5189f-config-volume\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.569697 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5crnd\" (UniqueName: 
\"kubernetes.io/projected/5eb99a8d-95ed-4e6b-8181-59a683f03f29-kube-api-access-5crnd\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:32:01.569697 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.569697 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5dk\" (UniqueName: \"kubernetes.io/projected/c56755ae-c685-4cd5-a21d-9b2df9f5189f-kube-api-access-jk5dk\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:32:01.569828 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.569786 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:01.569879 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.569841 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.069825573 +0000 UTC m=+33.591402506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:01.569879 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.569847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c56755ae-c685-4cd5-a21d-9b2df9f5189f-tmp-dir\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:01.569879 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.569859 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:01.569997 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.569912 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.069895676 +0000 UTC m=+33.591472606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:01.570387 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.570370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56755ae-c685-4cd5-a21d-9b2df9f5189f-config-volume\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:01.580294 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.580271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5dk\" (UniqueName: \"kubernetes.io/projected/c56755ae-c685-4cd5-a21d-9b2df9f5189f-kube-api-access-jk5dk\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:01.580383 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.580276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5crnd\" (UniqueName: \"kubernetes.io/projected/5eb99a8d-95ed-4e6b-8181-59a683f03f29-kube-api-access-5crnd\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:01.771419 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.771336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:32:01.771542 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.771491 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:01.771583 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.771556 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:32:33.771538203 +0000 UTC m=+65.293115147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:01.871645 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:01.871618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:01.871762 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.871744 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:32:01.871762 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.871757 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:32:01.871827 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.871767 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fsgsq for pod openshift-network-diagnostics/network-check-target-hqwh2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:01.871827 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:01.871811 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq podName:ffde06b8-a22f-482c-89a5-3fa86598f73d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:33.871797854 +0000 UTC m=+65.393374786 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsgsq" (UniqueName: "kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq") pod "network-check-target-hqwh2" (UID: "ffde06b8-a22f-482c-89a5-3fa86598f73d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:02.072316 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.072286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:02.072438 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.072349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:02.072476 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:02.072442 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:02.072476 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:02.072450 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:02.072537 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:02.072500 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:03.072484775 +0000 UTC m=+34.594061710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:02.072537 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:02.072513 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:03.072506931 +0000 UTC m=+34.594083859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:02.166211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.166182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:02.166304 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.166183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:32:02.168692 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.168671 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:32:02.168805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.168675 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\""
Apr 17 16:32:02.168805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.168714 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:32:02.168805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.168732 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:32:02.169199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:02.169184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\""
Apr 17 16:32:03.077395 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:03.077363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:03.077854 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:03.077406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:03.077854 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:03.077499 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:03.077854 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:03.077509 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:03.077854 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:03.077548 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:05.077534728 +0000 UTC m=+36.599111655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:03.077854 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:03.077571 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:05.077556684 +0000 UTC m=+36.599133613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:05.090082 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:05.090042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:05.090486 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:05.090140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:05.090486 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:05.090199 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:05.090486 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:05.090264 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:09.090248548 +0000 UTC m=+40.611825477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:05.090486 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:05.090279 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:05.090486 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:05.090324 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:09.090311849 +0000 UTC m=+40.611888777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:09.117904 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:09.117729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:09.118354 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:09.117934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:09.118354 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:09.117858 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:09.118354 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:09.118014 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:09.118354 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:09.118031 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:17.118007702 +0000 UTC m=+48.639584631 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:09.118354 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:09.118049 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:17.118038933 +0000 UTC m=+48.639615864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:10.795978 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:10.795920 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:edd7b883364dcfd9a811079ba1b6106d36063c1dce522a7602a646fc54160570 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df"
Apr 17 16:32:10.796416 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:10.796137 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:whereabouts-cni-bincopy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljh8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-gdzlp_openshift-multus(8039245d-5cc0-42eb-bd46-e84c3ff6d2dd): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:edd7b883364dcfd9a811079ba1b6106d36063c1dce522a7602a646fc54160570 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 17 16:32:10.797333 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:10.797304 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"whereabouts-cni-bincopy\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:edd7b883364dcfd9a811079ba1b6106d36063c1dce522a7602a646fc54160570 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" podUID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd"
Apr 17 16:32:11.466575 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:11.466544 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"whereabouts-cni-bincopy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:edd7b883364dcfd9a811079ba1b6106d36063c1dce522a7602a646fc54160570 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" podUID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd"
Apr 17 16:32:17.172518 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:17.172480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:17.173001 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:17.172583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:17.173001 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:17.172623 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:17.173001 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:17.172688 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:33.172671457 +0000 UTC m=+64.694248384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:17.173001 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:17.172700 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:17.173001 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:17.172754 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:32:33.172736 +0000 UTC m=+64.694312933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:26.441534 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:26.441456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jknk"
Apr 17 16:32:32.507993 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:32.507960 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="bf786d9df3e86b42cf8b2b5ee4b376359beaa19bad8c3d8836067f4b9b890b23" exitCode=0
Apr 17 16:32:32.508403 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:32.508014 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"bf786d9df3e86b42cf8b2b5ee4b376359beaa19bad8c3d8836067f4b9b890b23"}
Apr 17 16:32:33.182437 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.182409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:32:33.182604 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.182451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:32:33.182604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.182538 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:33.182604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.182540 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:33.182604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.182584 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:05.182571045 +0000 UTC m=+96.704147974 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:32:33.182604 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.182596 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:33:05.182590275 +0000 UTC m=+96.704167204 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:33.513038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.512952 2578 generic.go:358] "Generic (PLEG): container finished" podID="8039245d-5cc0-42eb-bd46-e84c3ff6d2dd" containerID="911c5a641c1ce33773fced02a9939e7dd4f04a0c56e7ac728493b5a267459993" exitCode=0
Apr 17 16:32:33.513038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.513004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerDied","Data":"911c5a641c1ce33773fced02a9939e7dd4f04a0c56e7ac728493b5a267459993"}
Apr 17 16:32:33.787330 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.787241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:32:33.789734 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.789717 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:32:33.797708 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.797690 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:32:33.797790 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:32:33.797748 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.797726732 +0000 UTC m=+129.319303665 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : secret "metrics-daemon-secret" not found
Apr 17 16:32:33.888542 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.888503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:33.891028 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.891012 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:32:33.901018 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.901001 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:32:33.912574 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.912552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsgsq\" (UniqueName: \"kubernetes.io/projected/ffde06b8-a22f-482c-89a5-3fa86598f73d-kube-api-access-fsgsq\") pod \"network-check-target-hqwh2\" (UID: \"ffde06b8-a22f-482c-89a5-3fa86598f73d\") " pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:33.978335 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.978313 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\""
Apr 17 16:32:33.987036 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:33.987006 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:34.112954 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:34.112928 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hqwh2"]
Apr 17 16:32:34.116437 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:32:34.116410 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffde06b8_a22f_482c_89a5_3fa86598f73d.slice/crio-23389d720b627f400f204d45bac46fd75ba93b0574bb497a4c026c06ce4c4dff WatchSource:0}: Error finding container 23389d720b627f400f204d45bac46fd75ba93b0574bb497a4c026c06ce4c4dff: Status 404 returned error can't find the container with id 23389d720b627f400f204d45bac46fd75ba93b0574bb497a4c026c06ce4c4dff
Apr 17 16:32:34.517123 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:34.517092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" event={"ID":"8039245d-5cc0-42eb-bd46-e84c3ff6d2dd","Type":"ContainerStarted","Data":"2c534247e22f7fe8b0a1a205d2291668c9ab94daee103173cd66e7bec3ee238e"}
Apr 17 16:32:34.518024 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:34.518003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hqwh2" event={"ID":"ffde06b8-a22f-482c-89a5-3fa86598f73d","Type":"ContainerStarted","Data":"23389d720b627f400f204d45bac46fd75ba93b0574bb497a4c026c06ce4c4dff"}
Apr 17 16:32:34.540308 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:34.540270 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gdzlp" podStartSLOduration=4.19284263 podStartE2EDuration="1m5.540256369s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:31:30.431318237 +0000 UTC m=+1.952895164" lastFinishedPulling="2026-04-17 16:32:31.778731974 +0000 UTC m=+63.300308903" observedRunningTime="2026-04-17 16:32:34.538450044 +0000 UTC m=+66.060026995" watchObservedRunningTime="2026-04-17 16:32:34.540256369 +0000 UTC m=+66.061833298"
Apr 17 16:32:37.524889 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:37.524845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hqwh2" event={"ID":"ffde06b8-a22f-482c-89a5-3fa86598f73d","Type":"ContainerStarted","Data":"67cb5cc97e993ce4c770a724a767f3c51fb4ef4ca6a1fee99bf7920d500782be"}
Apr 17 16:32:37.525388 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:37.525026 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:32:37.542560 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:32:37.542513 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hqwh2" podStartSLOduration=65.730752016 podStartE2EDuration="1m8.54250165s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:32:34.118215164 +0000 UTC m=+65.639792092" lastFinishedPulling="2026-04-17 16:32:36.929964797 +0000 UTC m=+68.451541726" observedRunningTime="2026-04-17 16:32:37.541568654 +0000 UTC m=+69.063145604" watchObservedRunningTime="2026-04-17 16:32:37.54250165 +0000 UTC m=+69.064078670"
Apr 17 16:33:05.198961 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:05.198815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd"
Apr 17 16:33:05.199408 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:05.198990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:33:05.199408 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:05.199145 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:33:05.199408 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:05.199165 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:33:05.199408 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:05.199234 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls podName:c56755ae-c685-4cd5-a21d-9b2df9f5189f nodeName:}" failed. No retries permitted until 2026-04-17 16:34:09.19921194 +0000 UTC m=+160.720788867 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls") pod "dns-default-ghkgl" (UID: "c56755ae-c685-4cd5-a21d-9b2df9f5189f") : secret "dns-default-metrics-tls" not found
Apr 17 16:33:05.199408 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:05.199251 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert podName:5eb99a8d-95ed-4e6b-8181-59a683f03f29 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:09.199242898 +0000 UTC m=+160.720819831 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert") pod "ingress-canary-lfzcd" (UID: "5eb99a8d-95ed-4e6b-8181-59a683f03f29") : secret "canary-serving-cert" not found
Apr 17 16:33:08.529264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:08.529233 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hqwh2"
Apr 17 16:33:21.872220 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.872183 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t7k46"]
Apr 17 16:33:21.875038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.875008 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zgh5t"]
Apr 17 16:33:21.875212 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.875191 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:33:21.878477 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.878448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:21.879004 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.878985 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 16:33:21.879681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.879664 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:21.882860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.882839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 16:33:21.882978 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.882874 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 16:33:21.882978 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.882902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.882978 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.882931 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.882978 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.882965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 16:33:21.883603 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.883563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 16:33:21.883932 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.883879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-tkfq6\"" Apr 17 16:33:21.886134 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.886116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9v4ww\"" Apr 17 16:33:21.887392 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.887372 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-t7k46"] Apr 17 16:33:21.891837 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.891816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 16:33:21.894414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.894394 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 16:33:21.896901 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:21.896879 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zgh5t"] Apr 17 16:33:22.004616 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-config\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.004745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c76994-eea6-40ad-81ff-21383f7c251b-serving-cert\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-trusted-ca\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.004745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-tmp\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-service-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bc9r\" (UniqueName: \"kubernetes.io/projected/52c76994-eea6-40ad-81ff-21383f7c251b-kube-api-access-7bc9r\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004845 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-snapshots\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.004871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8989b18c-2718-4e13-895b-5944e510a981-serving-cert\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.004984 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.004878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fq7q\" (UniqueName: \"kubernetes.io/projected/8989b18c-2718-4e13-895b-5944e510a981-kube-api-access-9fq7q\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.105119 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105119 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bc9r\" (UniqueName: \"kubernetes.io/projected/52c76994-eea6-40ad-81ff-21383f7c251b-kube-api-access-7bc9r\") pod 
\"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105282 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-snapshots\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105282 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105157 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8989b18c-2718-4e13-895b-5944e510a981-serving-cert\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.105368 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fq7q\" (UniqueName: \"kubernetes.io/projected/8989b18c-2718-4e13-895b-5944e510a981-kube-api-access-9fq7q\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.105368 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-config\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.105462 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105392 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c76994-eea6-40ad-81ff-21383f7c251b-serving-cert\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105462 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-trusted-ca\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.105554 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-tmp\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105554 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-service-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105982 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-tmp\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.105982 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.105952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/52c76994-eea6-40ad-81ff-21383f7c251b-snapshots\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.106240 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.106219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-config\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.106339 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.106220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.106432 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.106414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c76994-eea6-40ad-81ff-21383f7c251b-service-ca-bundle\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.106559 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.106539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8989b18c-2718-4e13-895b-5944e510a981-trusted-ca\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") 
" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.107659 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.107638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8989b18c-2718-4e13-895b-5944e510a981-serving-cert\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.108055 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.108039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c76994-eea6-40ad-81ff-21383f7c251b-serving-cert\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.113491 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.113468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bc9r\" (UniqueName: \"kubernetes.io/projected/52c76994-eea6-40ad-81ff-21383f7c251b-kube-api-access-7bc9r\") pod \"insights-operator-585dfdc468-zgh5t\" (UID: \"52c76994-eea6-40ad-81ff-21383f7c251b\") " pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.113884 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.113868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fq7q\" (UniqueName: \"kubernetes.io/projected/8989b18c-2718-4e13-895b-5944e510a981-kube-api-access-9fq7q\") pod \"console-operator-9d4b6777b-t7k46\" (UID: \"8989b18c-2718-4e13-895b-5944e510a981\") " pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.186665 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.186605 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" Apr 17 16:33:22.193222 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.193201 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" Apr 17 16:33:22.326338 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.326305 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t7k46"] Apr 17 16:33:22.329645 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:22.329622 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8989b18c_2718_4e13_895b_5944e510a981.slice/crio-90d1d8c7261114cc2977e6a63e7d1e9b8bfde57d2232bf6123910ee116a8b56d WatchSource:0}: Error finding container 90d1d8c7261114cc2977e6a63e7d1e9b8bfde57d2232bf6123910ee116a8b56d: Status 404 returned error can't find the container with id 90d1d8c7261114cc2977e6a63e7d1e9b8bfde57d2232bf6123910ee116a8b56d Apr 17 16:33:22.340246 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.340223 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zgh5t"] Apr 17 16:33:22.342971 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:22.342946 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c76994_eea6_40ad_81ff_21383f7c251b.slice/crio-56ac4e775483e8f782a699af1bd6c488036b5510ef38493b0974c320afefda74 WatchSource:0}: Error finding container 56ac4e775483e8f782a699af1bd6c488036b5510ef38493b0974c320afefda74: Status 404 returned error can't find the container with id 56ac4e775483e8f782a699af1bd6c488036b5510ef38493b0974c320afefda74 Apr 17 16:33:22.382830 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.382806 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm"] Apr 17 16:33:22.386871 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.386857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" Apr 17 16:33:22.389057 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.389040 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mwbl6\"" Apr 17 16:33:22.395049 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.395030 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm"] Apr 17 16:33:22.511807 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.511750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjf5\" (UniqueName: \"kubernetes.io/projected/921f716a-048a-4236-8a81-8bb9b570e437-kube-api-access-5pjf5\") pod \"network-check-source-8894fc9bd-5z9sm\" (UID: \"921f716a-048a-4236-8a81-8bb9b570e437\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" Apr 17 16:33:22.612112 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.612088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjf5\" (UniqueName: \"kubernetes.io/projected/921f716a-048a-4236-8a81-8bb9b570e437-kube-api-access-5pjf5\") pod \"network-check-source-8894fc9bd-5z9sm\" (UID: \"921f716a-048a-4236-8a81-8bb9b570e437\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" Apr 17 16:33:22.612214 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.612167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" 
event={"ID":"52c76994-eea6-40ad-81ff-21383f7c251b","Type":"ContainerStarted","Data":"56ac4e775483e8f782a699af1bd6c488036b5510ef38493b0974c320afefda74"} Apr 17 16:33:22.613058 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.613038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" event={"ID":"8989b18c-2718-4e13-895b-5944e510a981","Type":"ContainerStarted","Data":"90d1d8c7261114cc2977e6a63e7d1e9b8bfde57d2232bf6123910ee116a8b56d"} Apr 17 16:33:22.620110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.620092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjf5\" (UniqueName: \"kubernetes.io/projected/921f716a-048a-4236-8a81-8bb9b570e437-kube-api-access-5pjf5\") pod \"network-check-source-8894fc9bd-5z9sm\" (UID: \"921f716a-048a-4236-8a81-8bb9b570e437\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" Apr 17 16:33:22.695247 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.695228 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" Apr 17 16:33:22.819087 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:22.819036 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm"] Apr 17 16:33:22.823299 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:22.823267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod921f716a_048a_4236_8a81_8bb9b570e437.slice/crio-d1992d41244d2c2a1c133782f01d32cfa8bf2c2b2a25012358658bedd2e8464d WatchSource:0}: Error finding container d1992d41244d2c2a1c133782f01d32cfa8bf2c2b2a25012358658bedd2e8464d: Status 404 returned error can't find the container with id d1992d41244d2c2a1c133782f01d32cfa8bf2c2b2a25012358658bedd2e8464d Apr 17 16:33:23.617418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:23.617377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" event={"ID":"921f716a-048a-4236-8a81-8bb9b570e437","Type":"ContainerStarted","Data":"5140c394497366cd09706d93f49b306d7103cebdb3d95240ed2f050649da8dc7"} Apr 17 16:33:23.617785 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:23.617495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" event={"ID":"921f716a-048a-4236-8a81-8bb9b570e437","Type":"ContainerStarted","Data":"d1992d41244d2c2a1c133782f01d32cfa8bf2c2b2a25012358658bedd2e8464d"} Apr 17 16:33:23.634701 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:23.634391 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5z9sm" podStartSLOduration=1.634374719 podStartE2EDuration="1.634374719s" podCreationTimestamp="2026-04-17 16:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:23.634282408 +0000 UTC m=+115.155859359" watchObservedRunningTime="2026-04-17 16:33:23.634374719 +0000 UTC m=+115.155951669" Apr 17 16:33:25.622698 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.622662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" event={"ID":"52c76994-eea6-40ad-81ff-21383f7c251b","Type":"ContainerStarted","Data":"a2dd5a0e3a22a67f3b66c8ac7d1448c3d1725e10cbff35a7dc4436bdcd4a78ef"} Apr 17 16:33:25.624123 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.624104 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/0.log" Apr 17 16:33:25.624215 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.624143 2578 generic.go:358] "Generic (PLEG): container finished" podID="8989b18c-2718-4e13-895b-5944e510a981" containerID="2f138fed331bb3accc7f7de499e6acef00e60687f0aaa09c07602221617b7bb5" exitCode=255 Apr 17 16:33:25.624249 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.624213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" event={"ID":"8989b18c-2718-4e13-895b-5944e510a981","Type":"ContainerDied","Data":"2f138fed331bb3accc7f7de499e6acef00e60687f0aaa09c07602221617b7bb5"} Apr 17 16:33:25.624377 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.624366 2578 scope.go:117] "RemoveContainer" containerID="2f138fed331bb3accc7f7de499e6acef00e60687f0aaa09c07602221617b7bb5" Apr 17 16:33:25.637604 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:25.637563 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" podStartSLOduration=2.366950247 podStartE2EDuration="4.637550582s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" 
firstStartedPulling="2026-04-17 16:33:22.34465727 +0000 UTC m=+113.866234198" lastFinishedPulling="2026-04-17 16:33:24.615257588 +0000 UTC m=+116.136834533" observedRunningTime="2026-04-17 16:33:25.636991286 +0000 UTC m=+117.158568251" watchObservedRunningTime="2026-04-17 16:33:25.637550582 +0000 UTC m=+117.159127531" Apr 17 16:33:26.627622 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.627594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/1.log" Apr 17 16:33:26.628003 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.627917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/0.log" Apr 17 16:33:26.628003 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.627947 2578 generic.go:358] "Generic (PLEG): container finished" podID="8989b18c-2718-4e13-895b-5944e510a981" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f" exitCode=255 Apr 17 16:33:26.628106 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.628046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" event={"ID":"8989b18c-2718-4e13-895b-5944e510a981","Type":"ContainerDied","Data":"fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f"} Apr 17 16:33:26.628146 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.628117 2578 scope.go:117] "RemoveContainer" containerID="2f138fed331bb3accc7f7de499e6acef00e60687f0aaa09c07602221617b7bb5" Apr 17 16:33:26.628292 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:26.628271 2578 scope.go:117] "RemoveContainer" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f" Apr 17 16:33:26.628511 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:26.628490 2578 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981"
Apr 17 16:33:27.266462 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.266430 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"]
Apr 17 16:33:27.270312 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.270297 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.272509 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.272477 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 16:33:27.272509 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.272495 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 16:33:27.272687 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.272512 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vh9qs\""
Apr 17 16:33:27.276275 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.276255 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"]
Apr 17 16:33:27.351375 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.351343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/70703068-0a34-4ea5-8d18-b0d1a8b73858-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.351530 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.351407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.452156 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.452121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/70703068-0a34-4ea5-8d18-b0d1a8b73858-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.452285 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.452178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.452285 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:27.452269 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:27.452390 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:27.452321 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert podName:70703068-0a34-4ea5-8d18-b0d1a8b73858 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:27.952306294 +0000 UTC m=+119.473883223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dcvbr" (UID: "70703068-0a34-4ea5-8d18-b0d1a8b73858") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:27.453272 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.453251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/70703068-0a34-4ea5-8d18-b0d1a8b73858-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.631421 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.631400 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/1.log"
Apr 17 16:33:27.631759 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.631731 2578 scope.go:117] "RemoveContainer" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f"
Apr 17 16:33:27.631909 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:27.631891 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981"
Apr 17 16:33:27.681226 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.681200 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l9qg_61ffcc07-b8ef-4fcc-ab95-d8a4d75484df/dns-node-resolver/0.log"
Apr 17 16:33:27.720191 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.720166 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"]
Apr 17 16:33:27.723120 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.723104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"
Apr 17 16:33:27.726870 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.726850 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:27.726965 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.726888 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 16:33:27.726965 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.726951 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qrkps\""
Apr 17 16:33:27.731760 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.731741 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"]
Apr 17 16:33:27.856272 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.856242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqkm\" (UniqueName: \"kubernetes.io/projected/112efc8b-0a44-456a-8159-1b69f0cd48ea-kube-api-access-pmqkm\") pod \"migrator-74bb7799d9-dpq59\" (UID: \"112efc8b-0a44-456a-8159-1b69f0cd48ea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"
Apr 17 16:33:27.956784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.956727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqkm\" (UniqueName: \"kubernetes.io/projected/112efc8b-0a44-456a-8159-1b69f0cd48ea-kube-api-access-pmqkm\") pod \"migrator-74bb7799d9-dpq59\" (UID: \"112efc8b-0a44-456a-8159-1b69f0cd48ea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"
Apr 17 16:33:27.956784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.956770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:27.956914 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:27.956857 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:27.956914 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:27.956903 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert podName:70703068-0a34-4ea5-8d18-b0d1a8b73858 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:28.9568897 +0000 UTC m=+120.478466627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dcvbr" (UID: "70703068-0a34-4ea5-8d18-b0d1a8b73858") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:27.964529 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:27.964506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqkm\" (UniqueName: \"kubernetes.io/projected/112efc8b-0a44-456a-8159-1b69f0cd48ea-kube-api-access-pmqkm\") pod \"migrator-74bb7799d9-dpq59\" (UID: \"112efc8b-0a44-456a-8159-1b69f0cd48ea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"
Apr 17 16:33:28.031727 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:28.031706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"
Apr 17 16:33:28.144838 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:28.144811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59"]
Apr 17 16:33:28.149621 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:28.149594 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112efc8b_0a44_456a_8159_1b69f0cd48ea.slice/crio-cfa7ccf684dfba770d76deda202c8a912336a924707ce652ec915c2faab5914c WatchSource:0}: Error finding container cfa7ccf684dfba770d76deda202c8a912336a924707ce652ec915c2faab5914c: Status 404 returned error can't find the container with id cfa7ccf684dfba770d76deda202c8a912336a924707ce652ec915c2faab5914c
Apr 17 16:33:28.635024 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:28.634995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59" event={"ID":"112efc8b-0a44-456a-8159-1b69f0cd48ea","Type":"ContainerStarted","Data":"cfa7ccf684dfba770d76deda202c8a912336a924707ce652ec915c2faab5914c"}
Apr 17 16:33:28.886986 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:28.886906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5ft4z_eb979380-a8c1-43a4-b8ad-f3ba0967a2d7/node-ca/0.log"
Apr 17 16:33:28.964128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:28.964089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:28.964283 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:28.964170 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:28.964283 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:28.964262 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert podName:70703068-0a34-4ea5-8d18-b0d1a8b73858 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:30.964241543 +0000 UTC m=+122.485818494 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dcvbr" (UID: "70703068-0a34-4ea5-8d18-b0d1a8b73858") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:29.639524 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:29.639487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59" event={"ID":"112efc8b-0a44-456a-8159-1b69f0cd48ea","Type":"ContainerStarted","Data":"f815694fc441a562c83fcd5cb8c680996b6ca721d52ff7e6c3296c5e6dea340b"}
Apr 17 16:33:29.639524 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:29.639527 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59" event={"ID":"112efc8b-0a44-456a-8159-1b69f0cd48ea","Type":"ContainerStarted","Data":"207f31a7857cc134748a835565aab8068968d2ab07daf73b47cd44864628f1eb"}
Apr 17 16:33:29.658270 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:29.658217 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dpq59" podStartSLOduration=1.6231868459999998 podStartE2EDuration="2.658203434s" podCreationTimestamp="2026-04-17 16:33:27 +0000 UTC" firstStartedPulling="2026-04-17 16:33:28.151912703 +0000 UTC m=+119.673489631" lastFinishedPulling="2026-04-17 16:33:29.186929291 +0000 UTC m=+120.708506219" observedRunningTime="2026-04-17 16:33:29.656825943 +0000 UTC m=+121.178402894" watchObservedRunningTime="2026-04-17 16:33:29.658203434 +0000 UTC m=+121.179780384"
Apr 17 16:33:30.979909 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:30.979867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:30.980316 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:30.980002 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:30.980316 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:30.980093 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert podName:70703068-0a34-4ea5-8d18-b0d1a8b73858 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:34.980057728 +0000 UTC m=+126.501634655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dcvbr" (UID: "70703068-0a34-4ea5-8d18-b0d1a8b73858") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:32.187018 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:32.186987 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:33:32.187018 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:32.187023 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:33:32.187438 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:32.187372 2578 scope.go:117] "RemoveContainer" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f"
Apr 17 16:33:32.187555 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:32.187537 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981"
Apr 17 16:33:35.007410 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:35.007375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:35.007774 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:35.007531 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:35.007774 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:35.007598 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert podName:70703068-0a34-4ea5-8d18-b0d1a8b73858 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:43.007580569 +0000 UTC m=+134.529157498 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dcvbr" (UID: "70703068-0a34-4ea5-8d18-b0d1a8b73858") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:37.826443 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:37.826401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:33:37.826815 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:37.826540 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:33:37.826815 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:37.826606 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs podName:a6f8630a-c602-4066-a1c1-66f602f947fc nodeName:}" failed. No retries permitted until 2026-04-17 16:35:39.826589874 +0000 UTC m=+251.348166801 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs") pod "network-metrics-daemon-598xw" (UID: "a6f8630a-c602-4066-a1c1-66f602f947fc") : secret "metrics-daemon-secret" not found
Apr 17 16:33:43.068368 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.068324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:43.070973 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.070939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/70703068-0a34-4ea5-8d18-b0d1a8b73858-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dcvbr\" (UID: \"70703068-0a34-4ea5-8d18-b0d1a8b73858\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:43.181411 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.181380 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vh9qs\""
Apr 17 16:33:43.189437 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.189415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"
Apr 17 16:33:43.311918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.311884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr"]
Apr 17 16:33:43.316490 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:43.316463 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70703068_0a34_4ea5_8d18_b0d1a8b73858.slice/crio-2723b00d52b9ba4085c5e459146889b100cfb08a40fc8b1757f4398bcd36bed4 WatchSource:0}: Error finding container 2723b00d52b9ba4085c5e459146889b100cfb08a40fc8b1757f4398bcd36bed4: Status 404 returned error can't find the container with id 2723b00d52b9ba4085c5e459146889b100cfb08a40fc8b1757f4398bcd36bed4
Apr 17 16:33:43.673260 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:43.673170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr" event={"ID":"70703068-0a34-4ea5-8d18-b0d1a8b73858","Type":"ContainerStarted","Data":"2723b00d52b9ba4085c5e459146889b100cfb08a40fc8b1757f4398bcd36bed4"}
Apr 17 16:33:44.677027 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:44.676998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr" event={"ID":"70703068-0a34-4ea5-8d18-b0d1a8b73858","Type":"ContainerStarted","Data":"b50e8518e47cb5686ae7942a583345399e964000c8baa3567b0ec36c78e77fea"}
Apr 17 16:33:44.695802 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:44.695741 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dcvbr" podStartSLOduration=16.81520506 podStartE2EDuration="17.695725195s" podCreationTimestamp="2026-04-17 16:33:27 +0000 UTC" firstStartedPulling="2026-04-17 16:33:43.318427531 +0000 UTC m=+134.840004472" lastFinishedPulling="2026-04-17 16:33:44.198947677 +0000 UTC m=+135.720524607" observedRunningTime="2026-04-17 16:33:44.695566006 +0000 UTC m=+136.217142958" watchObservedRunningTime="2026-04-17 16:33:44.695725195 +0000 UTC m=+136.217302149"
Apr 17 16:33:46.166313 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.166280 2578 scope.go:117] "RemoveContainer" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f"
Apr 17 16:33:46.683322 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.683295 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 16:33:46.683679 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.683661 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/1.log"
Apr 17 16:33:46.683726 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.683696 2578 generic.go:358] "Generic (PLEG): container finished" podID="8989b18c-2718-4e13-895b-5944e510a981" containerID="7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8" exitCode=255
Apr 17 16:33:46.683765 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.683745 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" event={"ID":"8989b18c-2718-4e13-895b-5944e510a981","Type":"ContainerDied","Data":"7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8"}
Apr 17 16:33:46.683798 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.683771 2578 scope.go:117] "RemoveContainer" containerID="fdabee3320a36c83dce681d2dec88e21dfa8174e0a32e2d78df73ce6e129d79f"
Apr 17 16:33:46.684200 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:46.684180 2578 scope.go:117] "RemoveContainer" containerID="7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8"
Apr 17 16:33:46.684467 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:46.684449 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981"
Apr 17 16:33:47.687192 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.687157 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 16:33:47.956205 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.956128 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q8sb6"]
Apr 17 16:33:47.960489 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.960466 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:47.963393 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.963374 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr7rd\""
Apr 17 16:33:47.964232 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.964173 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:33:47.964232 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.964216 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:33:47.991763 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:47.991737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q8sb6"]
Apr 17 16:33:48.106323 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.106292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2f40-0d74-466f-8161-40616ba653a0-data-volume\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.106474 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.106335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqk4g\" (UniqueName: \"kubernetes.io/projected/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-api-access-qqk4g\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.106474 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.106397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc0f2f40-0d74-466f-8161-40616ba653a0-crio-socket\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.106474 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.106416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.106596 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.106477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc0f2f40-0d74-466f-8161-40616ba653a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.206979 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.206913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2f40-0d74-466f-8161-40616ba653a0-data-volume\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.206979 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.206950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqk4g\" (UniqueName: \"kubernetes.io/projected/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-api-access-qqk4g\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207179 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc0f2f40-0d74-466f-8161-40616ba653a0-crio-socket\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207179 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207179 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc0f2f40-0d74-466f-8161-40616ba653a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207317 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc0f2f40-0d74-466f-8161-40616ba653a0-crio-socket\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207317 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2f40-0d74-466f-8161-40616ba653a0-data-volume\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.207674 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.207654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.209647 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.209627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc0f2f40-0d74-466f-8161-40616ba653a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.220471 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.220446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqk4g\" (UniqueName: \"kubernetes.io/projected/dc0f2f40-0d74-466f-8161-40616ba653a0-kube-api-access-qqk4g\") pod \"insights-runtime-extractor-q8sb6\" (UID: \"dc0f2f40-0d74-466f-8161-40616ba653a0\") " pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.269536 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.269516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q8sb6"
Apr 17 16:33:48.396337 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.396310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q8sb6"]
Apr 17 16:33:48.399788 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:48.399766 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc0f2f40_0d74_466f_8161_40616ba653a0.slice/crio-0154ae657bedaa3e2c24df491fe4bfcc8f21c5cd7e1d8bdf2ab4f1bc290e5bff WatchSource:0}: Error finding container 0154ae657bedaa3e2c24df491fe4bfcc8f21c5cd7e1d8bdf2ab4f1bc290e5bff: Status 404 returned error can't find the container with id 0154ae657bedaa3e2c24df491fe4bfcc8f21c5cd7e1d8bdf2ab4f1bc290e5bff
Apr 17 16:33:48.691964 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.691931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q8sb6" event={"ID":"dc0f2f40-0d74-466f-8161-40616ba653a0","Type":"ContainerStarted","Data":"2d801a144753b0ed5bb35c8f43da32e0be428707af00c973bec88770a10012ef"}
Apr 17 16:33:48.691964 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:48.691969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q8sb6" event={"ID":"dc0f2f40-0d74-466f-8161-40616ba653a0","Type":"ContainerStarted","Data":"0154ae657bedaa3e2c24df491fe4bfcc8f21c5cd7e1d8bdf2ab4f1bc290e5bff"}
Apr 17 16:33:49.695795 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:49.695763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q8sb6" event={"ID":"dc0f2f40-0d74-466f-8161-40616ba653a0","Type":"ContainerStarted","Data":"5c7079fdbcf53e40b067d1c940c7d5dd26bf7d9b7d6d4fe0b1d6f9730572736c"}
Apr 17 16:33:51.704535 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:51.704496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q8sb6" event={"ID":"dc0f2f40-0d74-466f-8161-40616ba653a0","Type":"ContainerStarted","Data":"e86d68fdc1ce7c5f44433a1a377c2fcbaa698ec26f1f1746a7208a4fbe9c76ab"}
Apr 17 16:33:51.726850 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:51.726805 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q8sb6" podStartSLOduration=2.615580484 podStartE2EDuration="4.72679141s" podCreationTimestamp="2026-04-17 16:33:47 +0000 UTC" firstStartedPulling="2026-04-17 16:33:48.531993445 +0000 UTC m=+140.053570373" lastFinishedPulling="2026-04-17 16:33:50.643204371 +0000 UTC m=+142.164781299" observedRunningTime="2026-04-17 16:33:51.725480568 +0000 UTC m=+143.247057517" watchObservedRunningTime="2026-04-17 16:33:51.72679141 +0000 UTC m=+143.248368361"
Apr 17 16:33:52.187181 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:52.187152 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:33:52.187181 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:52.187186 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:33:52.187528 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:52.187503 2578 scope.go:117] "RemoveContainer" containerID="7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8"
Apr 17 16:33:52.187676 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:52.187660 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981"
Apr 17 16:33:56.665622 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.665592 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q"]
Apr 17 16:33:56.668710 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.668693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q"
Apr 17 16:33:56.674653 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.674410 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-zh5cb\""
Apr 17 16:33:56.674724 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.674696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 16:33:56.683577 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.683551 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q"]
Apr 17 16:33:56.766508 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.766476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9a73140-1bc2-42f0-be27-699cd3ace384-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-r2s7q\" (UID: \"c9a73140-1bc2-42f0-be27-699cd3ace384\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q"
Apr 17 16:33:56.867113 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.867088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9a73140-1bc2-42f0-be27-699cd3ace384-tls-certificates\") pod
\"prometheus-operator-admission-webhook-57cf98b594-r2s7q\" (UID: \"c9a73140-1bc2-42f0-be27-699cd3ace384\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" Apr 17 16:33:56.869617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.869596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9a73140-1bc2-42f0-be27-699cd3ace384-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-r2s7q\" (UID: \"c9a73140-1bc2-42f0-be27-699cd3ace384\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" Apr 17 16:33:56.977563 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:56.977507 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" Apr 17 16:33:57.089093 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:57.089050 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q"] Apr 17 16:33:57.092303 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:33:57.092278 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a73140_1bc2_42f0_be27_699cd3ace384.slice/crio-222bfd96ee497ac2a65d8a5f0e287bb486c9605eb80b70b6105a502a1c2aa5b2 WatchSource:0}: Error finding container 222bfd96ee497ac2a65d8a5f0e287bb486c9605eb80b70b6105a502a1c2aa5b2: Status 404 returned error can't find the container with id 222bfd96ee497ac2a65d8a5f0e287bb486c9605eb80b70b6105a502a1c2aa5b2 Apr 17 16:33:57.717854 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:57.717816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" 
event={"ID":"c9a73140-1bc2-42f0-be27-699cd3ace384","Type":"ContainerStarted","Data":"222bfd96ee497ac2a65d8a5f0e287bb486c9605eb80b70b6105a502a1c2aa5b2"} Apr 17 16:33:58.721818 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:58.721785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" event={"ID":"c9a73140-1bc2-42f0-be27-699cd3ace384","Type":"ContainerStarted","Data":"a26965fc6c186ada8120c3fbf567c52e46df2086d5ee7d2648ca805366aee707"} Apr 17 16:33:58.722191 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:58.721999 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" Apr 17 16:33:58.726441 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:58.726422 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" Apr 17 16:33:58.738636 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:58.738595 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r2s7q" podStartSLOduration=1.763421141 podStartE2EDuration="2.73858331s" podCreationTimestamp="2026-04-17 16:33:56 +0000 UTC" firstStartedPulling="2026-04-17 16:33:57.09414546 +0000 UTC m=+148.615722402" lastFinishedPulling="2026-04-17 16:33:58.06930763 +0000 UTC m=+149.590884571" observedRunningTime="2026-04-17 16:33:58.737711153 +0000 UTC m=+150.259288114" watchObservedRunningTime="2026-04-17 16:33:58.73858331 +0000 UTC m=+150.260160296" Apr 17 16:33:59.736255 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.736224 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jz2ql"] Apr 17 16:33:59.739098 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.739060 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.742020 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.741998 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:33:59.742020 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.742019 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:33:59.742192 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.742037 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 16:33:59.742430 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.742000 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-px4dh\"" Apr 17 16:33:59.742700 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.742000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:33:59.743126 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.743102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 16:33:59.749398 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.749372 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jz2ql"] Apr 17 16:33:59.891786 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.891754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.891931 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.891806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.891931 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.891893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.891931 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.891922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltrn\" (UniqueName: \"kubernetes.io/projected/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-kube-api-access-vltrn\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.992996 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.992905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vltrn\" (UniqueName: \"kubernetes.io/projected/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-kube-api-access-vltrn\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.992996 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.992977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.993211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.993026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.993211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.993056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.993211 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:59.993152 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 16:33:59.993211 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:33:59.993198 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls podName:e90eeafd-bb31-4b7a-a3ec-9103e9a76283 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:34:00.493185498 +0000 UTC m=+152.014762430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-jz2ql" (UID: "e90eeafd-bb31-4b7a-a3ec-9103e9a76283") : secret "prometheus-operator-tls" not found Apr 17 16:33:59.993714 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.993695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:33:59.995617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:33:59.995595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:34:00.001817 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:00.001797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltrn\" (UniqueName: \"kubernetes.io/projected/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-kube-api-access-vltrn\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:34:00.497399 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:00.497372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:34:00.499806 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:00.499782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90eeafd-bb31-4b7a-a3ec-9103e9a76283-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jz2ql\" (UID: \"e90eeafd-bb31-4b7a-a3ec-9103e9a76283\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:34:00.649460 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:00.649425 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" Apr 17 16:34:00.770138 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:00.770059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jz2ql"] Apr 17 16:34:00.773133 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:34:00.773106 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90eeafd_bb31_4b7a_a3ec_9103e9a76283.slice/crio-8d175c10fbb7f5beba52c7c33b0d2ed5c8714814390d75ba3af4d81df1467a86 WatchSource:0}: Error finding container 8d175c10fbb7f5beba52c7c33b0d2ed5c8714814390d75ba3af4d81df1467a86: Status 404 returned error can't find the container with id 8d175c10fbb7f5beba52c7c33b0d2ed5c8714814390d75ba3af4d81df1467a86 Apr 17 16:34:01.730643 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:01.730606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" 
event={"ID":"e90eeafd-bb31-4b7a-a3ec-9103e9a76283","Type":"ContainerStarted","Data":"8d175c10fbb7f5beba52c7c33b0d2ed5c8714814390d75ba3af4d81df1467a86"} Apr 17 16:34:02.734648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:02.734616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" event={"ID":"e90eeafd-bb31-4b7a-a3ec-9103e9a76283","Type":"ContainerStarted","Data":"02b0c98f9b6b3d64f5171dc6d24b7b161b2243a5f80262e3360a78dd3404e0f2"} Apr 17 16:34:02.734648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:02.734652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" event={"ID":"e90eeafd-bb31-4b7a-a3ec-9103e9a76283","Type":"ContainerStarted","Data":"020191f449bc7cc63eead2d1ad23267cb18a3d289ce3eea6129da144135323c0"} Apr 17 16:34:02.751676 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:02.751630 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jz2ql" podStartSLOduration=2.495792312 podStartE2EDuration="3.751616942s" podCreationTimestamp="2026-04-17 16:33:59 +0000 UTC" firstStartedPulling="2026-04-17 16:34:00.774814069 +0000 UTC m=+152.296391000" lastFinishedPulling="2026-04-17 16:34:02.030638686 +0000 UTC m=+153.552215630" observedRunningTime="2026-04-17 16:34:02.750622531 +0000 UTC m=+154.272199480" watchObservedRunningTime="2026-04-17 16:34:02.751616942 +0000 UTC m=+154.273193892" Apr 17 16:34:04.165793 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:04.165766 2578 scope.go:117] "RemoveContainer" containerID="7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8" Apr 17 16:34:04.166159 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:04.165931 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-t7k46_openshift-console-operator(8989b18c-2718-4e13-895b-5944e510a981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podUID="8989b18c-2718-4e13-895b-5944e510a981" Apr 17 16:34:04.335753 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:04.335711 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ghkgl" podUID="c56755ae-c685-4cd5-a21d-9b2df9f5189f" Apr 17 16:34:04.340848 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:04.340824 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lfzcd" podUID="5eb99a8d-95ed-4e6b-8181-59a683f03f29" Apr 17 16:34:04.740030 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:04.740000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:34:04.740221 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:04.740013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ghkgl" Apr 17 16:34:05.133170 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.133140 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qjzz4"] Apr 17 16:34:05.136480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.136462 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.139773 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.139753 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p22nr\"" Apr 17 16:34:05.139888 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.139873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:34:05.140196 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.140179 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:34:05.140566 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.140546 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:34:05.180346 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:05.180323 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-598xw" podUID="a6f8630a-c602-4066-a1c1-66f602f947fc" Apr 17 16:34:05.231575 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-wtmp\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231685 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgx88\" (UniqueName: 
\"kubernetes.io/projected/1731b992-77af-4172-8c09-e0f9502982e1-kube-api-access-kgx88\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231685 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231795 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231795 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-root\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231795 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-metrics-client-ca\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231937 ip-10-0-138-170 kubenswrapper[2578]: 
I0417 16:34:05.231802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-textfile\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231937 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.231937 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.231866 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-sys\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332723 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-textfile\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332826 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjzz4\" 
(UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332826 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-sys\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332826 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-wtmp\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332993 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgx88\" (UniqueName: \"kubernetes.io/projected/1731b992-77af-4172-8c09-e0f9502982e1-kube-api-access-kgx88\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332993 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-sys\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4" Apr 17 16:34:05.332993 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.332975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-textfile\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-root\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-wtmp\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-metrics-client-ca\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1731b992-77af-4172-8c09-e0f9502982e1-root\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333172 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:05.333141 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 16:34:05.333494 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:34:05.333193 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls podName:1731b992-77af-4172-8c09-e0f9502982e1 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:05.833178714 +0000 UTC m=+157.354755642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls") pod "node-exporter-qjzz4" (UID: "1731b992-77af-4172-8c09-e0f9502982e1") : secret "node-exporter-tls" not found
Apr 17 16:34:05.333554 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.333703 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.333681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1731b992-77af-4172-8c09-e0f9502982e1-metrics-client-ca\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.335223 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.335208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.347865 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.347846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgx88\" (UniqueName: \"kubernetes.io/projected/1731b992-77af-4172-8c09-e0f9502982e1-kube-api-access-kgx88\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.837233 ip-10-0-138-170
kubenswrapper[2578]: I0417 16:34:05.837195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:05.839544 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:05.839514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1731b992-77af-4172-8c09-e0f9502982e1-node-exporter-tls\") pod \"node-exporter-qjzz4\" (UID: \"1731b992-77af-4172-8c09-e0f9502982e1\") " pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:06.045772 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.045744 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qjzz4"
Apr 17 16:34:06.053987 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:34:06.053959 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1731b992_77af_4172_8c09_e0f9502982e1.slice/crio-1fccdb79f12223002bad009e2458c97068f7ee62996468e8c66e25211b60d374 WatchSource:0}: Error finding container 1fccdb79f12223002bad009e2458c97068f7ee62996468e8c66e25211b60d374: Status 404 returned error can't find the container with id 1fccdb79f12223002bad009e2458c97068f7ee62996468e8c66e25211b60d374
Apr 17 16:34:06.280390 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.280355 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:06.284974 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.284958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.288316 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.288295 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 16:34:06.288414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.288296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 16:34:06.288577 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.288565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mscdk\""
Apr 17 16:34:06.289039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.289018 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 16:34:06.289129 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.289115 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 16:34:06.290950 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.290879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 16:34:06.290950 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.290917 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 16:34:06.290950 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.290936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 16:34:06.291347 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.291330 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 16:34:06.297133 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.297092 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 16:34:06.313251 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.313231 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:06.340308 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpcj6\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340384 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340384 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340460 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\"
(UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340460 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340519 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340519 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340481 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340519 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340618 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340618 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340618 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340618 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.340727 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.340636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441507 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441611 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441611 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441611 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441611 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441599 2578
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpcj6\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.441990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.441932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.442411 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.442390 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.444109 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.444059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.445365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.445365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.445541 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.445607 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445557 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.446156 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.446156 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.445978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.446438 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.446417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.446715 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.446697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.446715 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.446706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.450883 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.450865 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpcj6\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6\") pod \"alertmanager-main-0\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.595118 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.595038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:06.738041 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.738016 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:06.745707 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:06.745678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjzz4" event={"ID":"1731b992-77af-4172-8c09-e0f9502982e1","Type":"ContainerStarted","Data":"1fccdb79f12223002bad009e2458c97068f7ee62996468e8c66e25211b60d374"}
Apr 17 16:34:06.805131 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:34:06.805096 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb032b79_4a45_4271_a159_a451a0c232a7.slice/crio-c7d927e03b8e5e21dffc10fa23d550d69539fd8e3a69884d1c381e8005b1da45 WatchSource:0}: Error finding container c7d927e03b8e5e21dffc10fa23d550d69539fd8e3a69884d1c381e8005b1da45: Status 404 returned error can't find the container with id c7d927e03b8e5e21dffc10fa23d550d69539fd8e3a69884d1c381e8005b1da45
Apr 17 16:34:07.115007 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.114948 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67546d9545-sppsg"]
Apr 17 16:34:07.118216 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.118201 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.120875 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.120854 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 16:34:07.120875 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.120862 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 16:34:07.121021 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.120868 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 16:34:07.121021 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.120862 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 16:34:07.121021 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.120955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 16:34:07.121165 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.121150 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4ldgatlbv959b\""
Apr 17 16:34:07.121261 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.121247 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mfnkf\""
Apr 17 16:34:07.130444 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.130424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67546d9545-sppsg"]
Apr 17 16:34:07.147582 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147660 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147660 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f448m\" (UniqueName: \"kubernetes.io/projected/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-kube-api-access-f448m\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147660 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-grpc-tls\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-tls\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.147784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.147776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-metrics-client-ca\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248296 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-grpc-tls\") pod
\"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248409 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248409 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248409 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-tls\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248563 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-metrics-client-ca\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248563 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248657 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.248657 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.248587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f448m\" (UniqueName: \"kubernetes.io/projected/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-kube-api-access-f448m\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.249475 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.249430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-metrics-client-ca\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.251044 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-tls\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.251202 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.251441 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-grpc-tls\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.251538 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:07.251577 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67546d9545-sppsg\" (UID:
\"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" Apr 17 16:34:07.252001 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.251985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" Apr 17 16:34:07.257441 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.257425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f448m\" (UniqueName: \"kubernetes.io/projected/d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0-kube-api-access-f448m\") pod \"thanos-querier-67546d9545-sppsg\" (UID: \"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0\") " pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" Apr 17 16:34:07.427676 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.427591 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" Apr 17 16:34:07.592435 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.592406 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67546d9545-sppsg"] Apr 17 16:34:07.750236 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.750153 2578 generic.go:358] "Generic (PLEG): container finished" podID="1731b992-77af-4172-8c09-e0f9502982e1" containerID="33e39c5ddfc104b1e9dd7c0f645ce4c1829605c8bc6bf4470787f66a2316af9d" exitCode=0 Apr 17 16:34:07.750236 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.750209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjzz4" event={"ID":"1731b992-77af-4172-8c09-e0f9502982e1","Type":"ContainerDied","Data":"33e39c5ddfc104b1e9dd7c0f645ce4c1829605c8bc6bf4470787f66a2316af9d"} Apr 17 16:34:07.751420 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:07.751385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"c7d927e03b8e5e21dffc10fa23d550d69539fd8e3a69884d1c381e8005b1da45"} Apr 17 16:34:07.777095 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:34:07.777039 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e8f0d0_9cf7_484f_8943_5638ec9dfcc0.slice/crio-2697ed28807124e4d950541153a808e2e889c04705fae3273608ec5a48320a4b WatchSource:0}: Error finding container 2697ed28807124e4d950541153a808e2e889c04705fae3273608ec5a48320a4b: Status 404 returned error can't find the container with id 2697ed28807124e4d950541153a808e2e889c04705fae3273608ec5a48320a4b Apr 17 16:34:08.755437 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.755402 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" 
containerID="c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc" exitCode=0 Apr 17 16:34:08.755835 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.755469 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc"} Apr 17 16:34:08.756631 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.756513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"2697ed28807124e4d950541153a808e2e889c04705fae3273608ec5a48320a4b"} Apr 17 16:34:08.758208 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.758183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjzz4" event={"ID":"1731b992-77af-4172-8c09-e0f9502982e1","Type":"ContainerStarted","Data":"dfe3e501d81360b4f6ee64ce9982a633fba700a4c6a0ff4d47d88a38f34f88ce"} Apr 17 16:34:08.758302 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.758216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjzz4" event={"ID":"1731b992-77af-4172-8c09-e0f9502982e1","Type":"ContainerStarted","Data":"f96848a17b929c18f5b3818ada524d187789e610f779706f5a670ad74de985b1"} Apr 17 16:34:08.825801 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:08.825757 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qjzz4" podStartSLOduration=3.028115291 podStartE2EDuration="3.82574315s" podCreationTimestamp="2026-04-17 16:34:05 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.055662916 +0000 UTC m=+157.577239843" lastFinishedPulling="2026-04-17 16:34:06.853290771 +0000 UTC m=+158.374867702" observedRunningTime="2026-04-17 16:34:08.824765375 +0000 UTC 
m=+160.346342348" watchObservedRunningTime="2026-04-17 16:34:08.82574315 +0000 UTC m=+160.347320100" Apr 17 16:34:09.267782 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.267749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:34:09.268026 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.267810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:34:09.270434 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.270404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c56755ae-c685-4cd5-a21d-9b2df9f5189f-metrics-tls\") pod \"dns-default-ghkgl\" (UID: \"c56755ae-c685-4cd5-a21d-9b2df9f5189f\") " pod="openshift-dns/dns-default-ghkgl" Apr 17 16:34:09.270736 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.270719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eb99a8d-95ed-4e6b-8181-59a683f03f29-cert\") pod \"ingress-canary-lfzcd\" (UID: \"5eb99a8d-95ed-4e6b-8181-59a683f03f29\") " pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:34:09.543455 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.543420 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\"" Apr 17 16:34:09.543924 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.543903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\"" Apr 17 16:34:09.551198 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.551163 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ghkgl" Apr 17 16:34:09.551337 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.551236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lfzcd" Apr 17 16:34:09.727632 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.727584 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lfzcd"] Apr 17 16:34:09.729927 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:34:09.729903 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb99a8d_95ed_4e6b_8181_59a683f03f29.slice/crio-9fb41e83e00a8a195dd63fcae5a975da16a773fc209ad8c98d836eb88f88cbab WatchSource:0}: Error finding container 9fb41e83e00a8a195dd63fcae5a975da16a773fc209ad8c98d836eb88f88cbab: Status 404 returned error can't find the container with id 9fb41e83e00a8a195dd63fcae5a975da16a773fc209ad8c98d836eb88f88cbab Apr 17 16:34:09.749045 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.749007 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ghkgl"] Apr 17 16:34:09.764082 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.764040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lfzcd" event={"ID":"5eb99a8d-95ed-4e6b-8181-59a683f03f29","Type":"ContainerStarted","Data":"9fb41e83e00a8a195dd63fcae5a975da16a773fc209ad8c98d836eb88f88cbab"} Apr 17 16:34:09.765956 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:09.765917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" 
event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"28b4648d08cbd6ef008c0b5b4158716a893a9441a0a4c788ab835aae656bb8ce"} Apr 17 16:34:10.771035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:10.770999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"02d66655a6308bf51055d5e8b861b6c9b96e5f17402da218a5a78037fd75a3d7"} Apr 17 16:34:10.771035 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:10.771039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"1d325042f39535cd99af564cfcd02809931269844502c4d95dbfa877afb8b7cd"} Apr 17 16:34:10.772586 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:10.772532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ghkgl" event={"ID":"c56755ae-c685-4cd5-a21d-9b2df9f5189f","Type":"ContainerStarted","Data":"7719101e203e42dbd21f26dbdc7490e826498c21862ac945b15abba60fb2c671"} Apr 17 16:34:11.377334 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.377300 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:11.385603 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.385574 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.388221 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.388196 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:34:11.388363 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.388196 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:34:11.388504 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.388489 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:34:11.388565 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.388537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:34:11.388783 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.388761 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:34:11.389086 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.389041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:34:11.389160 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.389141 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:34:11.389299 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.389283 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8f9dt\"" Apr 17 16:34:11.389576 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.389560 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:34:11.389657 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.389575 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:34:11.390365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.390177 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:34:11.390365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.390212 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5a2bpae6g8khu\"" Apr 17 16:34:11.390365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.390253 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:34:11.390365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.390221 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:34:11.394049 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.394014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:34:11.401051 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.399792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:11.488575 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrcd\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.488758 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488880 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488909 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.488999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.489024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.489054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489100 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.489098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.489522 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.489176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.589858 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.589809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.589887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.589921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.589956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.589987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590011 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590039 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:34:11.590037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrcd\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.590373 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.590746 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.590632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.591543 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.591509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.592652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.592179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.594093 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.593961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.594414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.594244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.594414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.594298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.594414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.594329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.596176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.595423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.596176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.595871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.596176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.596133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.596371 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.594396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.597307 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.597266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.597517 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.597349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.599338 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.599296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.599564 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.599537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.599943 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.599917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.600027 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.599991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.600121 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.600020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.600121 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.600097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.601521 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.601500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.606842 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.606811 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrcd\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd\") pod \"prometheus-k8s-0\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.700727 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:11.700645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:12.153555 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.153497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:12.782044 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.782011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ghkgl" event={"ID":"c56755ae-c685-4cd5-a21d-9b2df9f5189f","Type":"ContainerStarted","Data":"7ca25a333aa0f9c8ed63c5ac5e4c5355e0bc4c01a2728333732c414f68edeaa3"}
Apr 17 16:34:12.782044 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.782046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ghkgl" event={"ID":"c56755ae-c685-4cd5-a21d-9b2df9f5189f","Type":"ContainerStarted","Data":"5196fa73bd41d78202a1459ca00034979337bb94135ec2ba36a4c20bcb82d152"}
Apr 17 16:34:12.783390 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.783365 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" exitCode=0
Apr 17 16:34:12.783485 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.783455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"}
Apr 17 16:34:12.783545 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.783486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"2ed79b82a6205dd031ab074d5c8ae7d8d9912752f1a4ce35cbe9e5fd773e6906"}
Apr 17 16:34:12.784760 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.784724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lfzcd" event={"ID":"5eb99a8d-95ed-4e6b-8181-59a683f03f29","Type":"ContainerStarted","Data":"ec759cbc5aa967ab2b5a99c30068eeebdf7e7dbec4744cca0294a535ff91f223"}
Apr 17 16:34:12.787708 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787688 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1"}
Apr 17 16:34:12.787794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6"}
Apr 17 16:34:12.787794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0"}
Apr 17 16:34:12.787794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b"}
Apr 17 16:34:12.787794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b"}
Apr 17 16:34:12.787794 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.787765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerStarted","Data":"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b"}
Apr 17 16:34:12.790097 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.790077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"7f6bd27e1dab4304930f8a01f97b53a327a9f5ae7036334b54ae6ee97a30a9fe"}
Apr 17 16:34:12.790185 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.790100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"7c91b9cbdcf76d16ce9b73d937c2beba69cc257ce09ee1fcb530ea042ceedbec"}
Apr 17 16:34:12.790185 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.790109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" event={"ID":"d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0","Type":"ContainerStarted","Data":"30eb8a72815a680fe03b4afbe3f032d7e1d618189c0a85269025a77235c71ad0"}
Apr 17 16:34:12.790294 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.790281 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:12.805038 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.804999 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ghkgl" podStartSLOduration=129.599254314 podStartE2EDuration="2m11.804986783s" podCreationTimestamp="2026-04-17 16:32:01 +0000 UTC" firstStartedPulling="2026-04-17 16:34:09.763634882 +0000 UTC m=+161.285211815" lastFinishedPulling="2026-04-17 16:34:11.969367342 +0000 UTC m=+163.490944284" observedRunningTime="2026-04-17 16:34:12.803798101 +0000 UTC m=+164.325375045" watchObservedRunningTime="2026-04-17 16:34:12.804986783 +0000 UTC m=+164.326563732"
Apr 17 16:34:12.824562 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.824529 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lfzcd" podStartSLOduration=129.58608266 podStartE2EDuration="2m11.824521601s" podCreationTimestamp="2026-04-17 16:32:01 +0000 UTC" firstStartedPulling="2026-04-17 16:34:09.733726424 +0000 UTC m=+161.255303351" lastFinishedPulling="2026-04-17 16:34:11.97216535 +0000 UTC m=+163.493742292" observedRunningTime="2026-04-17 16:34:12.824053022 +0000 UTC m=+164.345629971" watchObservedRunningTime="2026-04-17 16:34:12.824521601 +0000 UTC m=+164.346098560"
Apr 17 16:34:12.897621 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.897577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.735350504 podStartE2EDuration="6.897565025s" podCreationTimestamp="2026-04-17 16:34:06 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.807131736 +0000 UTC m=+158.328708664" lastFinishedPulling="2026-04-17 16:34:11.969346256 +0000 UTC m=+163.490923185" observedRunningTime="2026-04-17 16:34:12.895968731 +0000 UTC m=+164.417545704" watchObservedRunningTime="2026-04-17 16:34:12.897565025 +0000 UTC m=+164.419141974"
Apr 17 16:34:12.920112 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:12.920041 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg" podStartSLOduration=1.7295826760000002 podStartE2EDuration="5.920029435s" podCreationTimestamp="2026-04-17 16:34:07 +0000 UTC" firstStartedPulling="2026-04-17 16:34:07.778900174 +0000 UTC m=+159.300477105" lastFinishedPulling="2026-04-17 16:34:11.969346934 +0000 UTC m=+163.490923864" observedRunningTime="2026-04-17 16:34:12.918966067 +0000 UTC m=+164.440543040" watchObservedRunningTime="2026-04-17 16:34:12.920029435 +0000 UTC m=+164.441606766"
Apr 17 16:34:13.794404 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:13.794372 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:34:15.807410 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807381 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"}
Apr 17 16:34:15.807410 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"}
Apr 17 16:34:15.807860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"}
Apr 17 16:34:15.807860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"}
Apr 17 16:34:15.807860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"}
Apr 17 16:34:15.807860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.807449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerStarted","Data":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"}
Apr 17 16:34:15.844418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:15.844372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.353284573 podStartE2EDuration="4.844357879s" podCreationTimestamp="2026-04-17 16:34:11 +0000 UTC" firstStartedPulling="2026-04-17 16:34:12.784556305 +0000 UTC m=+164.306133233" lastFinishedPulling="2026-04-17 16:34:15.275629608 +0000 UTC m=+166.797206539" observedRunningTime="2026-04-17 16:34:15.842128444 +0000 UTC m=+167.363705393" watchObservedRunningTime="2026-04-17 16:34:15.844357879 +0000 UTC m=+167.365934828"
Apr 17 16:34:16.701864 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:16.701819 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:17.165916 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.165889 2578 scope.go:117] "RemoveContainer" containerID="7754fafefd0dfb55497fc3fa422fba5d0ad123f7a0a47e16c171f76cd46b78e8"
Apr 17 16:34:17.815842 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.815815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 16:34:17.816013 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.815939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" event={"ID":"8989b18c-2718-4e13-895b-5944e510a981","Type":"ContainerStarted","Data":"89172b021836f7aca3baf0c5f0f655245662e9f105e1c00dffa3a4ce280855be"}
Apr 17 16:34:17.816348 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.816318 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:34:17.820947 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.820928 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46"
Apr 17 16:34:17.833606 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:17.833564 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-t7k46" podStartSLOduration=54.551621275 podStartE2EDuration="56.833549882s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:22.331276791 +0000 UTC m=+113.852853720" lastFinishedPulling="2026-04-17 16:33:24.613205396 +0000 UTC m=+116.134782327" observedRunningTime="2026-04-17 16:34:17.832911227 +0000 UTC m=+169.354488176" watchObservedRunningTime="2026-04-17 16:34:17.833549882 +0000 UTC m=+169.355126845"
Apr 17 16:34:18.800487 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:18.800462 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67546d9545-sppsg"
Apr 17 16:34:19.167459 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:19.167379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw"
Apr 17 16:34:23.799149 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:23.799114 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ghkgl"
Apr 17 16:34:50.918199 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:50.918164 2578 generic.go:358] "Generic (PLEG): container finished" podID="52c76994-eea6-40ad-81ff-21383f7c251b" containerID="a2dd5a0e3a22a67f3b66c8ac7d1448c3d1725e10cbff35a7dc4436bdcd4a78ef" exitCode=0
Apr 17 16:34:50.918610 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:50.918210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" event={"ID":"52c76994-eea6-40ad-81ff-21383f7c251b","Type":"ContainerDied","Data":"a2dd5a0e3a22a67f3b66c8ac7d1448c3d1725e10cbff35a7dc4436bdcd4a78ef"}
Apr 17 16:34:50.918610 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:50.918533 2578 scope.go:117] "RemoveContainer" containerID="a2dd5a0e3a22a67f3b66c8ac7d1448c3d1725e10cbff35a7dc4436bdcd4a78ef"
Apr 17 16:34:51.922974 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:34:51.922942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zgh5t" event={"ID":"52c76994-eea6-40ad-81ff-21383f7c251b","Type":"ContainerStarted","Data":"ea52ce12d1358687010a66886d508864a8d65139362346c441957ca0fbe92bd6"}
Apr 17 16:35:11.701822 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:11.701789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:35:11.722155 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:11.722134 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:35:11.997277 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:11.997199 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:35:25.398158 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398122 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:35:25.398632 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398569 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="alertmanager" containerID="cri-o://075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" gracePeriod=120
Apr 17 16:35:25.398704 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398624 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-metric" containerID="cri-o://d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" gracePeriod=120
Apr 17 16:35:25.398704 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398659 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-web" containerID="cri-o://3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" gracePeriod=120
Apr 17 16:35:25.398805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398683 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy" containerID="cri-o://ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" gracePeriod=120
Apr 17 16:35:25.398805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398717 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="config-reloader" containerID="cri-o://545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" gracePeriod=120
Apr 17 16:35:25.398805 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:25.398751 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="prom-label-proxy" containerID="cri-o://9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" gracePeriod=120
Apr 17 16:35:26.024648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024612 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" exitCode=0
Apr 17 16:35:26.024648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024643 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" exitCode=0
Apr 17 16:35:26.024648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024651 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" exitCode=0
Apr 17 16:35:26.024648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024657 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" exitCode=0
Apr 17 16:35:26.024918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1"}
Apr 17 16:35:26.024918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0"}
Apr 17 16:35:26.024918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b"}
Apr 17 16:35:26.024918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.024749 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b"}
Apr 17 16:35:26.643047 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.643018 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:35:26.804683 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804656 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.804847 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804689 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.804847 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804719 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.804847 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804759 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpcj6\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.804847 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804784 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.804847 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804825 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804868 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804901 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804933 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804962 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.804992 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805090 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805119 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config\") pod \"bb032b79-4a45-4271-a159-a451a0c232a7\" (UID: \"bb032b79-4a45-4271-a159-a451a0c232a7\") "
Apr 17 16:35:26.805652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805301 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:35:26.805652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805311 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:35:26.805652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805412 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\""
Apr 17 16:35:26.805652 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.805432 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb032b79-4a45-4271-a159-a451a0c232a7-metrics-client-ca\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\""
Apr 17 16:35:26.806264 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.806232 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:35:26.808306 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.808264 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:26.808571 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.808538 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.808669 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.808588 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.808854 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.808799 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.809026 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.809002 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). 
InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.809146 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.809030 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.809146 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.809056 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6" (OuterVolumeSpecName: "kube-api-access-xpcj6") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "kube-api-access-xpcj6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:26.809678 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.809658 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out" (OuterVolumeSpecName: "config-out") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:35:26.813615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.813590 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.819751 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.819729 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config" (OuterVolumeSpecName: "web-config") pod "bb032b79-4a45-4271-a159-a451a0c232a7" (UID: "bb032b79-4a45-4271-a159-a451a0c232a7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:26.906298 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906275 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906298 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906296 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906307 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-cluster-tls-config\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906317 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-config-out\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906326 2578 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-main-tls\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906334 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-web-config\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906348 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb032b79-4a45-4271-a159-a451a0c232a7-alertmanager-main-db\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906358 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-tls-assets\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906367 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-config-volume\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906375 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpcj6\" (UniqueName: \"kubernetes.io/projected/bb032b79-4a45-4271-a159-a451a0c232a7-kube-api-access-xpcj6\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:26.906414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:26.906390 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/bb032b79-4a45-4271-a159-a451a0c232a7-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:27.029608 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029582 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" exitCode=0 Apr 17 16:35:27.029608 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029609 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb032b79-4a45-4271-a159-a451a0c232a7" containerID="3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" exitCode=0 Apr 17 16:35:27.029744 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6"} Apr 17 16:35:27.029744 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029691 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.029744 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b"} Apr 17 16:35:27.029744 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb032b79-4a45-4271-a159-a451a0c232a7","Type":"ContainerDied","Data":"c7d927e03b8e5e21dffc10fa23d550d69539fd8e3a69884d1c381e8005b1da45"} Apr 17 16:35:27.029744 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.029738 2578 scope.go:117] "RemoveContainer" containerID="9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" Apr 17 16:35:27.037611 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.037596 2578 scope.go:117] "RemoveContainer" containerID="d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" Apr 17 16:35:27.044115 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.044099 2578 scope.go:117] "RemoveContainer" containerID="ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" Apr 17 16:35:27.050526 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.050511 2578 scope.go:117] "RemoveContainer" containerID="3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" Apr 17 16:35:27.054927 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.054880 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:35:27.057885 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.057865 2578 scope.go:117] "RemoveContainer" containerID="545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" Apr 17 16:35:27.058211 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.058193 2578 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:35:27.064258 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.064240 2578 scope.go:117] "RemoveContainer" containerID="075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" Apr 17 16:35:27.070457 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.070442 2578 scope.go:117] "RemoveContainer" containerID="c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc" Apr 17 16:35:27.076878 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.076863 2578 scope.go:117] "RemoveContainer" containerID="9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" Apr 17 16:35:27.077173 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.077155 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1\": container with ID starting with 9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1 not found: ID does not exist" containerID="9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" Apr 17 16:35:27.077225 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077181 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1"} err="failed to get container status \"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1\": rpc error: code = NotFound desc = could not find container \"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1\": container with ID starting with 9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1 not found: ID does not exist" Apr 17 16:35:27.077225 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077211 2578 scope.go:117] "RemoveContainer" containerID="d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" Apr 17 
16:35:27.077467 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.077451 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6\": container with ID starting with d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6 not found: ID does not exist" containerID="d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" Apr 17 16:35:27.077502 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077472 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6"} err="failed to get container status \"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6\": rpc error: code = NotFound desc = could not find container \"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6\": container with ID starting with d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6 not found: ID does not exist" Apr 17 16:35:27.077502 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077498 2578 scope.go:117] "RemoveContainer" containerID="ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" Apr 17 16:35:27.077734 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.077719 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0\": container with ID starting with ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0 not found: ID does not exist" containerID="ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" Apr 17 16:35:27.077769 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077740 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0"} err="failed to get container status \"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0\": rpc error: code = NotFound desc = could not find container \"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0\": container with ID starting with ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0 not found: ID does not exist" Apr 17 16:35:27.077769 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077756 2578 scope.go:117] "RemoveContainer" containerID="3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" Apr 17 16:35:27.077978 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.077962 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b\": container with ID starting with 3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b not found: ID does not exist" containerID="3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" Apr 17 16:35:27.078013 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.077983 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b"} err="failed to get container status \"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b\": rpc error: code = NotFound desc = could not find container \"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b\": container with ID starting with 3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b not found: ID does not exist" Apr 17 16:35:27.078013 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078004 2578 scope.go:117] "RemoveContainer" containerID="545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" Apr 17 16:35:27.078252 ip-10-0-138-170 
kubenswrapper[2578]: E0417 16:35:27.078237 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b\": container with ID starting with 545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b not found: ID does not exist" containerID="545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" Apr 17 16:35:27.078296 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078254 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b"} err="failed to get container status \"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b\": rpc error: code = NotFound desc = could not find container \"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b\": container with ID starting with 545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b not found: ID does not exist" Apr 17 16:35:27.078296 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078266 2578 scope.go:117] "RemoveContainer" containerID="075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" Apr 17 16:35:27.078517 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.078503 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b\": container with ID starting with 075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b not found: ID does not exist" containerID="075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" Apr 17 16:35:27.078567 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078522 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b"} 
err="failed to get container status \"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b\": rpc error: code = NotFound desc = could not find container \"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b\": container with ID starting with 075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b not found: ID does not exist" Apr 17 16:35:27.078567 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078537 2578 scope.go:117] "RemoveContainer" containerID="c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc" Apr 17 16:35:27.078753 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:27.078738 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc\": container with ID starting with c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc not found: ID does not exist" containerID="c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc" Apr 17 16:35:27.078792 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078756 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc"} err="failed to get container status \"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc\": rpc error: code = NotFound desc = could not find container \"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc\": container with ID starting with c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc not found: ID does not exist" Apr 17 16:35:27.078792 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078767 2578 scope.go:117] "RemoveContainer" containerID="9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1" Apr 17 16:35:27.078954 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078934 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1"} err="failed to get container status \"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1\": rpc error: code = NotFound desc = could not find container \"9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1\": container with ID starting with 9bf13ae3d8ba45d0c2f9f78256185d1b805cc8382313e8bdd8e76344d65203d1 not found: ID does not exist" Apr 17 16:35:27.078989 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.078955 2578 scope.go:117] "RemoveContainer" containerID="d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6" Apr 17 16:35:27.079165 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079147 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6"} err="failed to get container status \"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6\": rpc error: code = NotFound desc = could not find container \"d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6\": container with ID starting with d6cdbe69dee5baa04872a4c6193d68b1451f5771a62e0b8e8c4c1c138e4804b6 not found: ID does not exist" Apr 17 16:35:27.079204 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079166 2578 scope.go:117] "RemoveContainer" containerID="ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0" Apr 17 16:35:27.079385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079368 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0"} err="failed to get container status \"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0\": rpc error: code = NotFound desc = could not find container \"ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0\": container with ID starting with 
ceabf0a3d5e9da695fd4cb647e4ce0d9d418a6b6841ac7fbe332065d9d92edb0 not found: ID does not exist" Apr 17 16:35:27.079385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079384 2578 scope.go:117] "RemoveContainer" containerID="3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b" Apr 17 16:35:27.079587 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079569 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b"} err="failed to get container status \"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b\": rpc error: code = NotFound desc = could not find container \"3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b\": container with ID starting with 3dd6fd4d9a7627ac0ccec8010d1bfc00611f66f913f727b4a17b7780fe40a62b not found: ID does not exist" Apr 17 16:35:27.079587 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079587 2578 scope.go:117] "RemoveContainer" containerID="545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b" Apr 17 16:35:27.079786 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079772 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b"} err="failed to get container status \"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b\": rpc error: code = NotFound desc = could not find container \"545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b\": container with ID starting with 545071d96d70fd0d9ddc99285b4853e03cd2d2aeac2b59cf79f0f3873bc5475b not found: ID does not exist" Apr 17 16:35:27.079786 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079784 2578 scope.go:117] "RemoveContainer" containerID="075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b" Apr 17 16:35:27.079981 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079963 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b"} err="failed to get container status \"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b\": rpc error: code = NotFound desc = could not find container \"075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b\": container with ID starting with 075cfcc7ea2a914c0fd8fa04028180e89c8d23a268be241f7b76219c4fd70d1b not found: ID does not exist" Apr 17 16:35:27.080023 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.079982 2578 scope.go:117] "RemoveContainer" containerID="c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc" Apr 17 16:35:27.080207 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.080191 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc"} err="failed to get container status \"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc\": rpc error: code = NotFound desc = could not find container \"c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc\": container with ID starting with c214ef817fa7251d3520c374792e0ff7f985a74cc6262d413079355e44d5f5cc not found: ID does not exist" Apr 17 16:35:27.085937 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.085914 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:35:27.086212 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086200 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-metric" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086214 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-metric" Apr 17 
16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086225 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="alertmanager" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086230 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="alertmanager" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086239 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086245 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086250 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-web" Apr 17 16:35:27.086259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086255 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-web" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086264 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="prom-label-proxy" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086269 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="prom-label-proxy" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086277 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" 
containerName="init-config-reloader" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086282 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="init-config-reloader" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086291 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="config-reloader" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086296 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="config-reloader" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086336 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-web" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086343 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy-metric" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086349 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="alertmanager" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086356 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="config-reloader" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086362 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="prom-label-proxy" Apr 17 16:35:27.086463 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.086370 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="bb032b79-4a45-4271-a159-a451a0c232a7" containerName="kube-rbac-proxy" Apr 17 16:35:27.092693 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.092675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.095303 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095280 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:35:27.095450 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:35:27.095545 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095454 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mscdk\"" Apr 17 16:35:27.095612 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:35:27.095612 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095544 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:35:27.095707 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095636 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:35:27.095860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095844 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:35:27.095918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095907 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:35:27.095970 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.095920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:35:27.101668 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.101650 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:35:27.104023 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.104000 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:35:27.169128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.169092 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb032b79-4a45-4271-a159-a451a0c232a7" path="/var/lib/kubelet/pods/bb032b79-4a45-4271-a159-a451a0c232a7/volumes" Apr 17 16:35:27.208832 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.208807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.208938 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.208837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.208938 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.208856 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.208938 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.208871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209034 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.208965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-config-volume\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209034 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-web-config\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209119 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-config-out\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209119 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mqsj\" (UniqueName: \"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-kube-api-access-4mqsj\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209179 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209179 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209153 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209239 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209270 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.209306 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.209272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309594 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309594 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309594 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309746 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309778 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309822 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.309977 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.310176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-config-volume\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.310176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.310046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-web-config\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.310176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.310107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-config-out\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.310176 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.310137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mqsj\" (UniqueName: 
\"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-kube-api-access-4mqsj\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.310349 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.310310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.312962 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.312930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.312930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313110 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.313097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313222 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.313129 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313222 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.309951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313331 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.313245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-config-volume\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313496 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.313470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-web-config\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313801 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.313781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a128d97c-5289-4b89-9e74-6c42982f3eba-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.313965 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:35:27.313943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a128d97c-5289-4b89-9e74-6c42982f3eba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.314970 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.314950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a128d97c-5289-4b89-9e74-6c42982f3eba-config-out\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.315472 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.315454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.318850 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.318826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mqsj\" (UniqueName: \"kubernetes.io/projected/a128d97c-5289-4b89-9e74-6c42982f3eba-kube-api-access-4mqsj\") pod \"alertmanager-main-0\" (UID: \"a128d97c-5289-4b89-9e74-6c42982f3eba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.402160 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.402140 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:35:27.556045 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:27.556018 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:35:27.559772 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:35:27.559701 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda128d97c_5289_4b89_9e74_6c42982f3eba.slice/crio-dc306a81abe8d8fb1213fe4378ba4a2e6671af9ec3384f6623fd780423c78057 WatchSource:0}: Error finding container dc306a81abe8d8fb1213fe4378ba4a2e6671af9ec3384f6623fd780423c78057: Status 404 returned error can't find the container with id dc306a81abe8d8fb1213fe4378ba4a2e6671af9ec3384f6623fd780423c78057 Apr 17 16:35:28.033388 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:28.033310 2578 generic.go:358] "Generic (PLEG): container finished" podID="a128d97c-5289-4b89-9e74-6c42982f3eba" containerID="7e80e7a0569c429d24657e57edbdda793544beb8193cbe7e26b633e26fba1a47" exitCode=0 Apr 17 16:35:28.033733 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:28.033392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerDied","Data":"7e80e7a0569c429d24657e57edbdda793544beb8193cbe7e26b633e26fba1a47"} Apr 17 16:35:28.033733 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:28.033426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"dc306a81abe8d8fb1213fe4378ba4a2e6671af9ec3384f6623fd780423c78057"} Apr 17 16:35:29.039242 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"28f2e0bf7f267dcec9bbdf01c8e5a80c71186b425f3af531d3118a324139db92"} Apr 17 16:35:29.039599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"97d3debd0d329c60898a50cd2ca1fa5f01a4ac93f96c75757bf374ddb5e1011a"} Apr 17 16:35:29.039599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"83d903201ec4e10524574d15070a6e6b1262eb13246baa1ef5447cc339a04f5c"} Apr 17 16:35:29.039599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"19290cd9b4826f1442bfb65bb3c2648b04f0e62921798edee023548601bbbdef"} Apr 17 16:35:29.039599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"0fba5ff66d6fb74d66d61054f294b3823be9c0cfd438c50cf648ccbaa99c1ac1"} Apr 17 16:35:29.039599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.039285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a128d97c-5289-4b89-9e74-6c42982f3eba","Type":"ContainerStarted","Data":"3c5271714b2877913887fbcbca54574ad2ac269437eef3cdc2fbfb5ece2fc7a3"} Apr 17 16:35:29.066911 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.066857 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.066839329 podStartE2EDuration="2.066839329s" podCreationTimestamp="2026-04-17 16:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:29.065585974 +0000 UTC m=+240.587162924" watchObservedRunningTime="2026-04-17 16:35:29.066839329 +0000 UTC m=+240.588416281" Apr 17 16:35:29.713764 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.713731 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:29.714215 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714166 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="prometheus" containerID="cri-o://50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" gracePeriod=600 Apr 17 16:35:29.714359 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714206 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy" containerID="cri-o://0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" gracePeriod=600 Apr 17 16:35:29.714359 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714247 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-web" containerID="cri-o://f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" gracePeriod=600 Apr 17 16:35:29.714359 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714260 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" gracePeriod=600 Apr 17 16:35:29.714359 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714230 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="config-reloader" containerID="cri-o://31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" gracePeriod=600 Apr 17 16:35:29.714594 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.714442 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="thanos-sidecar" containerID="cri-o://2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" gracePeriod=600 Apr 17 16:35:29.963476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:29.963453 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.035560 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035470 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035560 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035527 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrcd\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035756 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035700 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035756 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035742 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035869 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035783 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle\") pod 
\"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035869 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035817 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035972 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035880 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.035972 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035815 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:30.035972 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035960 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.035984 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036202 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036243 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036289 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 
16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036316 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036352 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036399 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036434 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036482 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:35:30.036519 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out\") pod \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\" (UID: \"47c80c4d-75d7-4e1b-82d1-c22be91802c1\") " Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036161 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036212 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.036423 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.037048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:30.037228 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.037185 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038388 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038419 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038436 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038452 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038468 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-prometheus-k8s-db\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038514 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038486 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47c80c4d-75d7-4e1b-82d1-c22be91802c1-configmap-metrics-client-ca\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.038915 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.038871 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd" (OuterVolumeSpecName: "kube-api-access-bmrcd") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "kube-api-access-bmrcd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:30.039698 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.039672 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config" (OuterVolumeSpecName: "config") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.040391 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.040362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.040735 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.040710 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out" (OuterVolumeSpecName: "config-out") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:35:30.040965 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.040938 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.041037 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.040947 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). 
InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.041307 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.041280 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.041584 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.041550 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.041679 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.041595 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.042654 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.042628 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). 
InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.042779 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.042760 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:30.047307 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047283 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" exitCode=0 Apr 17 16:35:30.047307 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047306 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" exitCode=0 Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047312 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" exitCode=0 Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047319 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" exitCode=0 Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047324 2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" exitCode=0 Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047329 
2578 generic.go:358] "Generic (PLEG): container finished" podID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" exitCode=0 Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047404 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.047418 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} Apr 17 16:35:30.047681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047430 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} Apr 17 16:35:30.047681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} Apr 17 16:35:30.047681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} Apr 17 16:35:30.047681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047472 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"47c80c4d-75d7-4e1b-82d1-c22be91802c1","Type":"ContainerDied","Data":"2ed79b82a6205dd031ab074d5c8ae7d8d9912752f1a4ce35cbe9e5fd773e6906"} Apr 17 16:35:30.047681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.047491 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.053852 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.053821 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config" (OuterVolumeSpecName: "web-config") pod "47c80c4d-75d7-4e1b-82d1-c22be91802c1" (UID: "47c80c4d-75d7-4e1b-82d1-c22be91802c1"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:30.057304 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.055675 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.062806 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.062787 2578 scope.go:117] "RemoveContainer" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.069097 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.069077 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.075132 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.075116 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.081180 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.081165 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.087660 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.087642 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.093713 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.093697 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.093946 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.093928 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.094007 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:35:30.093959 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.094007 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.093984 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.094259 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.094243 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.094302 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094265 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.094302 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094283 2578 scope.go:117] "RemoveContainer" 
containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.094498 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.094483 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.094540 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094505 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.094540 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094526 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.094741 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.094725 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.094801 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094750 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.094801 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094770 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.094985 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.094968 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.095025 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.094989 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.095025 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095003 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.095349 
ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.095334 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.095394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095351 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.095394 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095363 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.095587 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:35:30.095571 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.095647 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095594 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.095647 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095617 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.095828 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095813 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.095886 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.095829 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.096020 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096006 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 
0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.096134 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096022 2578 scope.go:117] "RemoveContainer" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.096223 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096205 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.096279 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096225 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.096437 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096417 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.096487 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096438 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.096664 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096646 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.096728 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096665 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.096892 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096874 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.096937 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.096893 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.097177 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097156 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container 
\"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.097225 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097178 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.097382 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097366 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.097441 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097384 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.097609 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.097648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097610 2578 scope.go:117] "RemoveContainer" 
containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.097818 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097801 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.097864 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097819 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.097997 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097981 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.098059 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.097999 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.098190 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098170 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status 
\"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.098239 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098193 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.098376 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098360 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.098414 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098376 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.098549 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098534 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.098598 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:35:30.098549 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.098739 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098724 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.098787 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098739 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.098915 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098900 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.098957 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.098915 2578 scope.go:117] "RemoveContainer" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.099098 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099079 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.099167 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099100 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.099295 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099280 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.099360 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099296 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.099492 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099476 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 
31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.099531 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099494 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.099671 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099656 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.099710 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099671 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.099843 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099829 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.099888 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.099844 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.100029 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100014 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.100098 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100029 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.100217 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100203 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.100265 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100218 2578 scope.go:117] "RemoveContainer" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.100386 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100372 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container 
\"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.100434 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100386 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.100554 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100538 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.100596 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100555 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.100713 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100699 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.100757 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100713 2578 scope.go:117] "RemoveContainer" 
containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.100870 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100857 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.100918 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.100870 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.101033 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101020 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.101101 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101033 2578 scope.go:117] "RemoveContainer" containerID="217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9" Apr 17 16:35:30.101248 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101229 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9"} err="failed to get container status 
\"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": rpc error: code = NotFound desc = could not find container \"217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9\": container with ID starting with 217ed1c3430ee21fcf1cc154c8546c5262ebe7b1040e956a36b5472193d116a9 not found: ID does not exist" Apr 17 16:35:30.101316 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101249 2578 scope.go:117] "RemoveContainer" containerID="0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763" Apr 17 16:35:30.101474 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101456 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763"} err="failed to get container status \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": rpc error: code = NotFound desc = could not find container \"0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763\": container with ID starting with 0da0ef3f593e5d18a9fdad556e35465838867dc708ba2a1e2b939a44f9d9d763 not found: ID does not exist" Apr 17 16:35:30.101530 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101475 2578 scope.go:117] "RemoveContainer" containerID="f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa" Apr 17 16:35:30.101684 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101666 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa"} err="failed to get container status \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": rpc error: code = NotFound desc = could not find container \"f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa\": container with ID starting with f5c2496dc59398643924e01a8d847debf229086851d8a39999c41919f6b355aa not found: ID does not exist" Apr 17 16:35:30.101726 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:35:30.101687 2578 scope.go:117] "RemoveContainer" containerID="2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa" Apr 17 16:35:30.101908 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101891 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa"} err="failed to get container status \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": rpc error: code = NotFound desc = could not find container \"2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa\": container with ID starting with 2ee47f553467f12eacf81841b8b8a07d31720fad886786e10520c2c66ca179aa not found: ID does not exist" Apr 17 16:35:30.102006 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.101911 2578 scope.go:117] "RemoveContainer" containerID="31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf" Apr 17 16:35:30.102178 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.102161 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf"} err="failed to get container status \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": rpc error: code = NotFound desc = could not find container \"31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf\": container with ID starting with 31355e79e25225f1ac935eed82e829da424c26646120ef62f24d7bb93e63a3bf not found: ID does not exist" Apr 17 16:35:30.102225 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.102179 2578 scope.go:117] "RemoveContainer" containerID="50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e" Apr 17 16:35:30.102372 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.102354 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e"} err="failed to get container status \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": rpc error: code = NotFound desc = could not find container \"50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e\": container with ID starting with 50419acbfd739835c551dfb70f6dfe9d960f015c5fa872678cd9eb4c426b880e not found: ID does not exist" Apr 17 16:35:30.102431 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.102373 2578 scope.go:117] "RemoveContainer" containerID="86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb" Apr 17 16:35:30.102581 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.102560 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb"} err="failed to get container status \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": rpc error: code = NotFound desc = could not find container \"86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb\": container with ID starting with 86c9b7e919c0f4b0c5e8e2a8567ab8f8ac2ea96b432e4b54798ae6bd6c1db4eb not found: ID does not exist" Apr 17 16:35:30.139408 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139385 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-grpc-tls\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139414 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139480 ip-10-0-138-170 kubenswrapper[2578]: I0417 
16:35:30.139431 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139447 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-kube-rbac-proxy\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139461 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-web-config\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139473 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-tls-assets\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139486 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-metrics-client-certs\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139500 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config-out\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139514 2578 reconciler_common.go:299] "Volume detached for 
volume \"kube-api-access-bmrcd\" (UniqueName: \"kubernetes.io/projected/47c80c4d-75d7-4e1b-82d1-c22be91802c1-kube-api-access-bmrcd\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139529 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139544 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-config\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.139642 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.139566 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/47c80c4d-75d7-4e1b-82d1-c22be91802c1-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:35:30.371292 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.371269 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:30.374920 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.374899 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:30.425990 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.425967 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:30.426259 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426245 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="init-config-reloader" Apr 17 16:35:30.426259 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426260 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="init-config-reloader" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426269 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="prometheus" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426274 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="prometheus" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426281 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-thanos" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426286 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-thanos" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426296 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="config-reloader" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426302 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="config-reloader" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426311 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426316 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy" Apr 17 
16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426323 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-web" Apr 17 16:35:30.426327 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426329 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-web" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426339 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="thanos-sidecar" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426344 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="thanos-sidecar" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426411 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-web" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426420 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="thanos-sidecar" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426428 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="kube-rbac-proxy-thanos" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426435 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="config-reloader" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426442 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" 
containerName="kube-rbac-proxy" Apr 17 16:35:30.426637 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.426450 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" containerName="prometheus" Apr 17 16:35:30.431248 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.431233 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.437685 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.437663 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:35:30.437811 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.437791 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:35:30.439475 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439455 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:35:30.439819 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439796 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:35:30.440128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:35:30.440128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439909 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:35:30.440128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:35:30.440128 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.439936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:35:30.440128 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.440024 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:35:30.440323 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.440289 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:35:30.440363 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.440354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:35:30.440927 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.440909 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:35:30.445808 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.445795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5a2bpae6g8khu\"" Apr 17 16:35:30.445970 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.445959 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8f9dt\"" Apr 17 16:35:30.448480 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.448466 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:35:30.519398 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.519368 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:30.542201 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542178 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542284 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542284 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config-out\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542284 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" 
(UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542385 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-web-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:35:30.542540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5cr8\" (UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-kube-api-access-r5cr8\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542615 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.542799 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.542626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643182 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643182 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643182 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643324 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-web-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643324 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643222 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643399 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643346 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-r5cr8\" (UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-kube-api-access-r5cr8\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643399 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643486 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643486 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643486 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643628 ip-10-0-138-170 
kubenswrapper[2578]: I0417 16:35:30.643503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643628 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config-out\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643628 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643628 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643841 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643841 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643841 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.643841 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.644634 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.644356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.644634 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.643575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" 
(UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.646656 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.646578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.646934 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.646907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.647265 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.647241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.647666 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.647644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.648959 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.648932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.649112 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.649058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-web-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.649194 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.649141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.649826 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.649779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.650609 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.650580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.650784 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.650611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.650962 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.650897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.651381 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.651354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.653144 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.651705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.653144 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.652433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-config-out\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.653144 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.652908 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5cr8\" 
(UniqueName: \"kubernetes.io/projected/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-kube-api-access-r5cr8\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.654157 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.653955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.740756 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.740647 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.873273 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:30.873245 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:35:30.876747 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:35:30.876717 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7ecf39_be02_4bdd_8d5f_bed43dd9c24d.slice/crio-8446e09eea2dc2396809b22e9eaaf54850ab80fa70ecb9c67a8269e549b5827a WatchSource:0}: Error finding container 8446e09eea2dc2396809b22e9eaaf54850ab80fa70ecb9c67a8269e549b5827a: Status 404 returned error can't find the container with id 8446e09eea2dc2396809b22e9eaaf54850ab80fa70ecb9c67a8269e549b5827a Apr 17 16:35:31.052456 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:31.052423 2578 generic.go:358] "Generic (PLEG): container finished" podID="1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d" containerID="36185eb63e1c1314809fcefe242a3f5cafda13bf1e8e6f61ce9f62c0fa573a53" exitCode=0 Apr 17 16:35:31.052839 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:31.052498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerDied","Data":"36185eb63e1c1314809fcefe242a3f5cafda13bf1e8e6f61ce9f62c0fa573a53"} Apr 17 16:35:31.052839 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:31.052523 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"8446e09eea2dc2396809b22e9eaaf54850ab80fa70ecb9c67a8269e549b5827a"} Apr 17 16:35:31.172920 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:31.172885 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c80c4d-75d7-4e1b-82d1-c22be91802c1" path="/var/lib/kubelet/pods/47c80c4d-75d7-4e1b-82d1-c22be91802c1/volumes" Apr 17 16:35:32.058492 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"2907d9b91d97cb0882b48bfb8b942cede122b873a704e79800317789f88d9e84"} Apr 17 16:35:32.058492 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"c6607ef233a4c6439b1bc95938ffc919bd1d2875d8e29ab3842a5855f1b20ead"} Apr 17 16:35:32.058930 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"862888b03fd1c8ccc623931dc84aac32733f50f010b277bd8d63fbcaa08e88bd"} Apr 17 16:35:32.058930 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"d556cbd8caccd93be524c31b1d98f03171bfb7e297b1642bf017727b8f223d51"} Apr 17 16:35:32.058930 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058523 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"be001fb02dcec3baf1d29124627ba04f4357425ad722fb59231e8c58b16ea7e4"} Apr 17 16:35:32.058930 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.058534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d","Type":"ContainerStarted","Data":"c99c01aec6bfeb31d76b69c1b96a74efbc616e9e086647647edf461a4e466518"} Apr 17 16:35:32.091053 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:32.091005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.090990938 podStartE2EDuration="2.090990938s" podCreationTimestamp="2026-04-17 16:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:32.089784171 +0000 UTC m=+243.611361147" watchObservedRunningTime="2026-04-17 16:35:32.090990938 +0000 UTC m=+243.612567888" Apr 17 16:35:35.740967 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:35.740924 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:39.919033 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:39.918950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " 
pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:35:39.921365 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:39.921337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f8630a-c602-4066-a1c1-66f602f947fc-metrics-certs\") pod \"network-metrics-daemon-598xw\" (UID: \"a6f8630a-c602-4066-a1c1-66f602f947fc\") " pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:35:40.170572 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:40.170494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\"" Apr 17 16:35:40.178376 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:40.178353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-598xw" Apr 17 16:35:40.303309 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:40.303283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-598xw"] Apr 17 16:35:40.307624 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:35:40.307598 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f8630a_c602_4066_a1c1_66f602f947fc.slice/crio-6f30266d653b1b93769bb057a82a4c56f9adebdac1bc4752cac2184e73f06ef7 WatchSource:0}: Error finding container 6f30266d653b1b93769bb057a82a4c56f9adebdac1bc4752cac2184e73f06ef7: Status 404 returned error can't find the container with id 6f30266d653b1b93769bb057a82a4c56f9adebdac1bc4752cac2184e73f06ef7 Apr 17 16:35:41.092621 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:41.092587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-598xw" event={"ID":"a6f8630a-c602-4066-a1c1-66f602f947fc","Type":"ContainerStarted","Data":"6f30266d653b1b93769bb057a82a4c56f9adebdac1bc4752cac2184e73f06ef7"} Apr 17 16:35:42.098137 
ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:42.098104 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-598xw" event={"ID":"a6f8630a-c602-4066-a1c1-66f602f947fc","Type":"ContainerStarted","Data":"d3db34122ac46bc6ff6587a184b4c0eaff3c22433368600c3928c628a93917a3"} Apr 17 16:35:42.098137 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:42.098141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-598xw" event={"ID":"a6f8630a-c602-4066-a1c1-66f602f947fc","Type":"ContainerStarted","Data":"00bfb5e3465dabbfd2900d6311b832170ae957525d0975418732f407c955cbb1"} Apr 17 16:35:42.114056 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:35:42.114005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-598xw" podStartSLOduration=252.173770875 podStartE2EDuration="4m13.113991729s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:35:40.309580471 +0000 UTC m=+251.831157414" lastFinishedPulling="2026-04-17 16:35:41.249801341 +0000 UTC m=+252.771378268" observedRunningTime="2026-04-17 16:35:42.112696938 +0000 UTC m=+253.634273888" watchObservedRunningTime="2026-04-17 16:35:42.113991729 +0000 UTC m=+253.635568681" Apr 17 16:36:29.026439 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:29.026409 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:36:29.026439 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:29.026424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:36:29.036944 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:29.036925 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:36:29.037123 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:29.037105 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:36:29.041692 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:29.041675 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:36:30.741861 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:30.741829 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:36:30.758395 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:30.758370 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:36:31.252322 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:36:31.252296 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:37:35.250324 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.250245 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lh87q"] Apr 17 16:37:35.252456 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.252440 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.255483 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.255458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:37:35.306381 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.306353 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lh87q"] Apr 17 16:37:35.392745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.392722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-kubelet-config\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.392857 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.392762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-dbus\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.392857 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.392814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2aba92-9d01-443d-82e4-0f79cdef282b-original-pull-secret\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.493979 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.493944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-kubelet-config\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.494153 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.493995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-dbus\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.494153 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.494101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-kubelet-config\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.494153 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.494144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2aba92-9d01-443d-82e4-0f79cdef282b-dbus\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.494153 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.494148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2aba92-9d01-443d-82e4-0f79cdef282b-original-pull-secret\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.496513 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.496490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2aba92-9d01-443d-82e4-0f79cdef282b-original-pull-secret\") pod \"global-pull-secret-syncer-lh87q\" (UID: \"7a2aba92-9d01-443d-82e4-0f79cdef282b\") " pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.561665 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.561646 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lh87q" Apr 17 16:37:35.679617 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.679588 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lh87q"] Apr 17 16:37:35.683265 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:37:35.683239 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a2aba92_9d01_443d_82e4_0f79cdef282b.slice/crio-a6bf1fdfaacbdc29f1a43ca911df8e2e3227e794fc7ba7bfccb7e7770b008984 WatchSource:0}: Error finding container a6bf1fdfaacbdc29f1a43ca911df8e2e3227e794fc7ba7bfccb7e7770b008984: Status 404 returned error can't find the container with id a6bf1fdfaacbdc29f1a43ca911df8e2e3227e794fc7ba7bfccb7e7770b008984 Apr 17 16:37:35.684859 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:35.684844 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:37:36.425875 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:36.425839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lh87q" event={"ID":"7a2aba92-9d01-443d-82e4-0f79cdef282b","Type":"ContainerStarted","Data":"a6bf1fdfaacbdc29f1a43ca911df8e2e3227e794fc7ba7bfccb7e7770b008984"} Apr 17 16:37:39.435711 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:39.435626 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lh87q" 
event={"ID":"7a2aba92-9d01-443d-82e4-0f79cdef282b","Type":"ContainerStarted","Data":"901f897ddaecb5b47a4c6a467943d7929eab40a1c324ed9a686550843cc82c6d"} Apr 17 16:37:39.452442 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:37:39.452392 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lh87q" podStartSLOduration=1.091606552 podStartE2EDuration="4.45237822s" podCreationTimestamp="2026-04-17 16:37:35 +0000 UTC" firstStartedPulling="2026-04-17 16:37:35.684979625 +0000 UTC m=+367.206556554" lastFinishedPulling="2026-04-17 16:37:39.045751292 +0000 UTC m=+370.567328222" observedRunningTime="2026-04-17 16:37:39.450871064 +0000 UTC m=+370.972448013" watchObservedRunningTime="2026-04-17 16:37:39.45237822 +0000 UTC m=+370.973955170" Apr 17 16:40:00.878574 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.878543 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-97mcb"] Apr 17 16:40:00.881775 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.881759 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:00.884118 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.884097 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:40:00.884844 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.884821 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:40:00.884961 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.884865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 16:40:00.884961 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.884895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-cjgp5\"" Apr 17 16:40:00.890681 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:00.890661 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-97mcb"] Apr 17 16:40:01.018686 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.018657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wzt5\" (UniqueName: \"kubernetes.io/projected/b6fbe312-3c47-4597-97d9-f46977eba305-kube-api-access-9wzt5\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.018823 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.018705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6fbe312-3c47-4597-97d9-f46977eba305-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" 
Apr 17 16:40:01.120022 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.119994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wzt5\" (UniqueName: \"kubernetes.io/projected/b6fbe312-3c47-4597-97d9-f46977eba305-kube-api-access-9wzt5\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.120160 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.120034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6fbe312-3c47-4597-97d9-f46977eba305-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.122538 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.122512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6fbe312-3c47-4597-97d9-f46977eba305-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.127476 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.127450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wzt5\" (UniqueName: \"kubernetes.io/projected/b6fbe312-3c47-4597-97d9-f46977eba305-kube-api-access-9wzt5\") pod \"llmisvc-controller-manager-68cc5db7c4-97mcb\" (UID: \"b6fbe312-3c47-4597-97d9-f46977eba305\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.191243 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.191178 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:01.315371 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.315347 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-97mcb"] Apr 17 16:40:01.317428 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:40:01.317400 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb6fbe312_3c47_4597_97d9_f46977eba305.slice/crio-f32ec04b52880d2564a45f94a010b47c863f91714873341e5f606376b3094b6c WatchSource:0}: Error finding container f32ec04b52880d2564a45f94a010b47c863f91714873341e5f606376b3094b6c: Status 404 returned error can't find the container with id f32ec04b52880d2564a45f94a010b47c863f91714873341e5f606376b3094b6c Apr 17 16:40:01.837300 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:01.837271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" event={"ID":"b6fbe312-3c47-4597-97d9-f46977eba305","Type":"ContainerStarted","Data":"f32ec04b52880d2564a45f94a010b47c863f91714873341e5f606376b3094b6c"} Apr 17 16:40:03.846001 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:03.845971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" event={"ID":"b6fbe312-3c47-4597-97d9-f46977eba305","Type":"ContainerStarted","Data":"5d999451f3c4bbecd009c69671054a2df2efc0de4383f215ff696f4c0f64efca"} Apr 17 16:40:03.846372 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:03.846105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:40:03.864353 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:03.864304 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" podStartSLOduration=1.848660193 podStartE2EDuration="3.864291664s" 
podCreationTimestamp="2026-04-17 16:40:00 +0000 UTC" firstStartedPulling="2026-04-17 16:40:01.318750026 +0000 UTC m=+512.840326954" lastFinishedPulling="2026-04-17 16:40:03.334381498 +0000 UTC m=+514.855958425" observedRunningTime="2026-04-17 16:40:03.863041123 +0000 UTC m=+515.384618073" watchObservedRunningTime="2026-04-17 16:40:03.864291664 +0000 UTC m=+515.385868642" Apr 17 16:40:34.850793 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:40:34.850721 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-97mcb" Apr 17 16:41:24.976919 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:24.976886 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-fk4r2"] Apr 17 16:41:24.980092 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:24.980056 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-fk4r2" Apr 17 16:41:24.982372 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:24.982350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:41:24.982372 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:24.982367 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5rxj5\"" Apr 17 16:41:24.986196 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:24.986164 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-fk4r2"] Apr 17 16:41:25.051648 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:25.051622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhl2n\" (UniqueName: \"kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n\") pod \"s3-init-fk4r2\" (UID: \"06b869af-d630-4292-899b-5c96d1f04f1c\") " pod="kserve/s3-init-fk4r2" Apr 17 16:41:25.152769 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:25.152733 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhl2n\" (UniqueName: \"kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n\") pod \"s3-init-fk4r2\" (UID: \"06b869af-d630-4292-899b-5c96d1f04f1c\") " pod="kserve/s3-init-fk4r2" Apr 17 16:41:25.161198 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:25.161167 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhl2n\" (UniqueName: \"kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n\") pod \"s3-init-fk4r2\" (UID: \"06b869af-d630-4292-899b-5c96d1f04f1c\") " pod="kserve/s3-init-fk4r2" Apr 17 16:41:25.295094 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:25.295057 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-fk4r2" Apr 17 16:41:25.415952 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:25.415905 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-fk4r2"] Apr 17 16:41:25.419799 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:41:25.419767 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b869af_d630_4292_899b_5c96d1f04f1c.slice/crio-94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f WatchSource:0}: Error finding container 94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f: Status 404 returned error can't find the container with id 94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f Apr 17 16:41:26.090183 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:26.090119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-fk4r2" event={"ID":"06b869af-d630-4292-899b-5c96d1f04f1c","Type":"ContainerStarted","Data":"94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f"} Apr 17 16:41:29.984431 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:29.984393 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:41:29.985113 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:29.985088 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:41:29.990913 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:29.990891 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:41:29.991910 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:29.991889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:41:30.028978 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:30.028961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:41:30.104137 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:30.104108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-fk4r2" event={"ID":"06b869af-d630-4292-899b-5c96d1f04f1c","Type":"ContainerStarted","Data":"6e23fc495ce720b6f261ade9eafd881c67f04f07e903a1a3bb80bb67d73b27b4"} Apr 17 16:41:30.119242 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:30.119201 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-fk4r2" podStartSLOduration=1.513955728 podStartE2EDuration="6.119186992s" podCreationTimestamp="2026-04-17 16:41:24 +0000 UTC" firstStartedPulling="2026-04-17 16:41:25.421549152 +0000 UTC m=+596.943126080" lastFinishedPulling="2026-04-17 16:41:30.026780414 +0000 UTC m=+601.548357344" observedRunningTime="2026-04-17 16:41:30.117689435 +0000 UTC 
m=+601.639266385" watchObservedRunningTime="2026-04-17 16:41:30.119186992 +0000 UTC m=+601.640763942" Apr 17 16:41:33.114265 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:33.114233 2578 generic.go:358] "Generic (PLEG): container finished" podID="06b869af-d630-4292-899b-5c96d1f04f1c" containerID="6e23fc495ce720b6f261ade9eafd881c67f04f07e903a1a3bb80bb67d73b27b4" exitCode=0 Apr 17 16:41:33.114641 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:33.114289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-fk4r2" event={"ID":"06b869af-d630-4292-899b-5c96d1f04f1c","Type":"ContainerDied","Data":"6e23fc495ce720b6f261ade9eafd881c67f04f07e903a1a3bb80bb67d73b27b4"} Apr 17 16:41:34.241357 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:34.241334 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-fk4r2" Apr 17 16:41:34.327745 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:34.327720 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhl2n\" (UniqueName: \"kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n\") pod \"06b869af-d630-4292-899b-5c96d1f04f1c\" (UID: \"06b869af-d630-4292-899b-5c96d1f04f1c\") " Apr 17 16:41:34.329917 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:34.329893 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n" (OuterVolumeSpecName: "kube-api-access-mhl2n") pod "06b869af-d630-4292-899b-5c96d1f04f1c" (UID: "06b869af-d630-4292-899b-5c96d1f04f1c"). InnerVolumeSpecName "kube-api-access-mhl2n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:41:34.428687 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:34.428620 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhl2n\" (UniqueName: \"kubernetes.io/projected/06b869af-d630-4292-899b-5c96d1f04f1c-kube-api-access-mhl2n\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:41:35.120866 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:35.120835 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-fk4r2" Apr 17 16:41:35.120866 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:35.120845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-fk4r2" event={"ID":"06b869af-d630-4292-899b-5c96d1f04f1c","Type":"ContainerDied","Data":"94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f"} Apr 17 16:41:35.120866 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:35.120873 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94d4ca1581af2e7fe57d411656c46dd3e457bcc059c5ebac90f7f9f4666bd50f" Apr 17 16:41:45.355425 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.355347 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:41:45.355862 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.355673 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06b869af-d630-4292-899b-5c96d1f04f1c" containerName="s3-init" Apr 17 16:41:45.355862 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.355685 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b869af-d630-4292-899b-5c96d1f04f1c" containerName="s3-init" Apr 17 16:41:45.355862 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.355744 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="06b869af-d630-4292-899b-5c96d1f04f1c" containerName="s3-init" Apr 17 
16:41:45.358867 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.358851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.362111 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.362050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:41:45.362111 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.362051 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 17 16:41:45.362111 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.362051 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tvtvv\"" Apr 17 16:41:45.362351 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.362053 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 17 16:41:45.362351 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.362204 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:41:45.368941 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.368920 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:41:45.508888 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.508860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.509032 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.508899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb5vh\" (UniqueName: \"kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.509032 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.508973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.509032 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.509023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.609553 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.609477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" 
(UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.609553 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.609522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.609727 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.609643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.609727 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.609695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb5vh\" (UniqueName: \"kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.609903 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.609885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 
16:41:45.610191 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.610171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.612245 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.612226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.617495 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.617476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb5vh\" (UniqueName: \"kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.669156 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.669134 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:41:45.792720 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:45.792646 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:41:45.795059 ip-10-0-138-170 kubenswrapper[2578]: W0417 16:41:45.795028 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda947f5ca_ee30_454a_8725_075f92f19468.slice/crio-b9513ddf44967f29e9f6682802f5b7252f1c82bf5e2969c4be60c1c29593b84a WatchSource:0}: Error finding container b9513ddf44967f29e9f6682802f5b7252f1c82bf5e2969c4be60c1c29593b84a: Status 404 returned error can't find the container with id b9513ddf44967f29e9f6682802f5b7252f1c82bf5e2969c4be60c1c29593b84a Apr 17 16:41:46.153568 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:46.153529 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerStarted","Data":"b9513ddf44967f29e9f6682802f5b7252f1c82bf5e2969c4be60c1c29593b84a"} Apr 17 16:41:49.165140 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:49.165100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerStarted","Data":"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098"} Apr 17 16:41:53.177130 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:53.177043 2578 generic.go:358] "Generic (PLEG): container finished" podID="a947f5ca-ee30-454a-8725-075f92f19468" containerID="c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098" exitCode=0 Apr 17 16:41:53.177130 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:41:53.177106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerDied","Data":"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098"} Apr 17 16:42:06.222256 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:06.222204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerStarted","Data":"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e"} Apr 17 16:42:08.232172 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:08.232079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerStarted","Data":"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2"} Apr 17 16:42:08.232540 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:08.232199 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:42:08.249200 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:08.249157 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podStartSLOduration=1.249091682 podStartE2EDuration="23.249144647s" podCreationTimestamp="2026-04-17 16:41:45 +0000 UTC" firstStartedPulling="2026-04-17 16:41:45.796884735 +0000 UTC m=+617.318461666" lastFinishedPulling="2026-04-17 16:42:07.7969377 +0000 UTC m=+639.318514631" observedRunningTime="2026-04-17 16:42:08.247605216 +0000 UTC m=+639.769182165" watchObservedRunningTime="2026-04-17 16:42:08.249144647 +0000 UTC m=+639.770721597" Apr 17 16:42:09.235082 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:09.235041 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:42:09.236286 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:09.236246 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:10.238135 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:10.238095 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:15.242280 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:15.242254 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:42:15.242860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:15.242833 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:25.242763 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:25.242725 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:35.243055 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:35.243012 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:45.242751 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:45.242707 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:42:55.243449 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:42:55.243404 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:05.243516 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:05.243479 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:15.244015 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:15.243945 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:43:54.344613 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:54.344572 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:43:54.345217 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:54.344905 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" containerID="cri-o://9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e" gracePeriod=30 Apr 17 16:43:54.345217 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:54.344983 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kube-rbac-proxy" containerID="cri-o://c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2" gracePeriod=30 Apr 17 16:43:54.545153 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:54.545122 2578 generic.go:358] "Generic (PLEG): container finished" podID="a947f5ca-ee30-454a-8725-075f92f19468" containerID="c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2" exitCode=2 Apr 17 16:43:54.545300 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:54.545172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerDied","Data":"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2"} Apr 17 16:43:55.238896 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:55.238857 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 17 16:43:55.242927 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:55.242902 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" podUID="a947f5ca-ee30-454a-8725-075f92f19468" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:58.087790 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.087766 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:43:58.267523 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.267450 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb5vh\" (UniqueName: \"kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh\") pod \"a947f5ca-ee30-454a-8725-075f92f19468\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " Apr 17 16:43:58.267523 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.267492 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"a947f5ca-ee30-454a-8725-075f92f19468\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " Apr 17 16:43:58.267751 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.267530 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location\") pod \"a947f5ca-ee30-454a-8725-075f92f19468\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " Apr 17 16:43:58.267751 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.267561 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls\") pod \"a947f5ca-ee30-454a-8725-075f92f19468\" (UID: \"a947f5ca-ee30-454a-8725-075f92f19468\") " Apr 17 16:43:58.267926 ip-10-0-138-170 kubenswrapper[2578]: 
I0417 16:43:58.267895 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a947f5ca-ee30-454a-8725-075f92f19468" (UID: "a947f5ca-ee30-454a-8725-075f92f19468"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:43:58.267926 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.267915 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "a947f5ca-ee30-454a-8725-075f92f19468" (UID: "a947f5ca-ee30-454a-8725-075f92f19468"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:58.269844 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.269818 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh" (OuterVolumeSpecName: "kube-api-access-jb5vh") pod "a947f5ca-ee30-454a-8725-075f92f19468" (UID: "a947f5ca-ee30-454a-8725-075f92f19468"). InnerVolumeSpecName "kube-api-access-jb5vh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:58.269844 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.269831 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a947f5ca-ee30-454a-8725-075f92f19468" (UID: "a947f5ca-ee30-454a-8725-075f92f19468"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:58.368458 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.368433 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a947f5ca-ee30-454a-8725-075f92f19468-kserve-provision-location\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.368458 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.368457 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a947f5ca-ee30-454a-8725-075f92f19468-proxy-tls\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.368599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.368468 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jb5vh\" (UniqueName: \"kubernetes.io/projected/a947f5ca-ee30-454a-8725-075f92f19468-kube-api-access-jb5vh\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.368599 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.368478 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a947f5ca-ee30-454a-8725-075f92f19468-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.557468 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.557440 2578 generic.go:358] "Generic (PLEG): container finished" podID="a947f5ca-ee30-454a-8725-075f92f19468" containerID="9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e" exitCode=0 Apr 17 16:43:58.557600 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.557520 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" Apr 17 16:43:58.557600 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.557536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerDied","Data":"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e"} Apr 17 16:43:58.557600 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.557578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl" event={"ID":"a947f5ca-ee30-454a-8725-075f92f19468","Type":"ContainerDied","Data":"b9513ddf44967f29e9f6682802f5b7252f1c82bf5e2969c4be60c1c29593b84a"} Apr 17 16:43:58.557600 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.557595 2578 scope.go:117] "RemoveContainer" containerID="c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2" Apr 17 16:43:58.565751 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.565737 2578 scope.go:117] "RemoveContainer" containerID="9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e" Apr 17 16:43:58.572771 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.572753 2578 scope.go:117] "RemoveContainer" containerID="c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098" Apr 17 16:43:58.579109 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.579090 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:43:58.580471 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.580460 2578 scope.go:117] "RemoveContainer" containerID="c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2" Apr 17 16:43:58.580721 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:43:58.580700 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2\": container with ID starting with c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2 not found: ID does not exist" containerID="c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2" Apr 17 16:43:58.580828 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.580728 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2"} err="failed to get container status \"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2\": rpc error: code = NotFound desc = could not find container \"c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2\": container with ID starting with c80a9c50776ba980dabc3886e7c3b10f1abb26e5e437d7346cb4053f13c43bc2 not found: ID does not exist" Apr 17 16:43:58.580828 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.580745 2578 scope.go:117] "RemoveContainer" containerID="9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e" Apr 17 16:43:58.581014 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:43:58.580997 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e\": container with ID starting with 9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e not found: ID does not exist" containerID="9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e" Apr 17 16:43:58.581113 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.581023 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e"} err="failed to get container status \"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e\": rpc error: code = NotFound desc = could not 
find container \"9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e\": container with ID starting with 9918c6f2f85f6951af985699815ec147301383363fed6eb0c143bd29bf50e08e not found: ID does not exist" Apr 17 16:43:58.581113 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.581039 2578 scope.go:117] "RemoveContainer" containerID="c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098" Apr 17 16:43:58.581381 ip-10-0-138-170 kubenswrapper[2578]: E0417 16:43:58.581356 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098\": container with ID starting with c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098 not found: ID does not exist" containerID="c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098" Apr 17 16:43:58.581474 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.581387 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098"} err="failed to get container status \"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098\": rpc error: code = NotFound desc = could not find container \"c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098\": container with ID starting with c66426443fe793d6b30b6670bca35a0d7dddbf4cdcf3df70ce78f87fabcf9098 not found: ID does not exist" Apr 17 16:43:58.582883 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:58.582867 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl"] Apr 17 16:43:59.170713 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:43:59.170686 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a947f5ca-ee30-454a-8725-075f92f19468" path="/var/lib/kubelet/pods/a947f5ca-ee30-454a-8725-075f92f19468/volumes" Apr 17 
16:46:30.010362 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:46:30.010333 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:46:30.012796 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:46:30.012774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:46:30.014460 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:46:30.014438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:46:30.016545 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:46:30.016526 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:51:30.034347 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:51:30.034314 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:51:30.037136 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:51:30.037118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:51:30.038503 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:51:30.038481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:51:30.041286 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:51:30.041268 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:56:30.057692 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:56:30.057665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:56:30.060362 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:56:30.060344 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 16:56:30.061860 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:56:30.061841 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 16:56:30.064136 ip-10-0-138-170 kubenswrapper[2578]: I0417 16:56:30.064118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:01:30.080874 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:01:30.080759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:01:30.084107 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:01:30.083649 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:01:30.084960 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:01:30.084937 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:01:30.087447 ip-10-0-138-170 
kubenswrapper[2578]: I0417 17:01:30.087431 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:06:30.106132 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:06:30.106006 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:06:30.110151 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:06:30.106983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:06:30.110151 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:06:30.110137 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:06:30.111211 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:06:30.111190 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:11:30.129045 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:11:30.128936 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:11:30.132108 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:11:30.130815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:11:30.132833 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:11:30.132818 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:11:30.134891 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:11:30.134871 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:16:30.151938 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:16:30.151815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:16:30.155996 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:16:30.154690 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log" Apr 17 17:16:30.156349 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:16:30.156331 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:16:30.158924 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:16:30.158905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:21:15.021705 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021614 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4mgj/must-gather-l6vnw"] Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021953 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kube-rbac-proxy" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021965 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kube-rbac-proxy" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021978 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021984 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021992 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="storage-initializer" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.021998 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="storage-initializer" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.022048 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kserve-container" Apr 17 17:21:15.022182 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.022057 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a947f5ca-ee30-454a-8725-075f92f19468" containerName="kube-rbac-proxy" Apr 17 17:21:15.025099 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.025059 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.027879 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.027860 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4mgj\"/\"kube-root-ca.crt\""
Apr 17 17:21:15.028775 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.028755 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g4mgj\"/\"default-dockercfg-nrwf9\""
Apr 17 17:21:15.028858 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.028790 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4mgj\"/\"openshift-service-ca.crt\""
Apr 17 17:21:15.043776 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.043750 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4mgj/must-gather-l6vnw"]
Apr 17 17:21:15.086599 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.086570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.086693 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.086606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks2xl\" (UniqueName: \"kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.187747 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.187716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.187873 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.187755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks2xl\" (UniqueName: \"kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.188043 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.188021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.196575 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.196553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks2xl\" (UniqueName: \"kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl\") pod \"must-gather-l6vnw\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") " pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.349460 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.349432 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:15.478418 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.478390 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4mgj/must-gather-l6vnw"]
Apr 17 17:21:15.481681 ip-10-0-138-170 kubenswrapper[2578]: W0417 17:21:15.481652 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54434768_8ae8_4866_a2b6_6b1623f0adcb.slice/crio-597b0c7db118daa1c0e8e30047a1b45bff9f2c1c071401dae3caea9986fafac0 WatchSource:0}: Error finding container 597b0c7db118daa1c0e8e30047a1b45bff9f2c1c071401dae3caea9986fafac0: Status 404 returned error can't find the container with id 597b0c7db118daa1c0e8e30047a1b45bff9f2c1c071401dae3caea9986fafac0
Apr 17 17:21:15.483375 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:15.483357 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:21:16.139823 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:16.139781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" event={"ID":"54434768-8ae8-4866-a2b6-6b1623f0adcb","Type":"ContainerStarted","Data":"597b0c7db118daa1c0e8e30047a1b45bff9f2c1c071401dae3caea9986fafac0"}
Apr 17 17:21:20.155556 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:20.155508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" event={"ID":"54434768-8ae8-4866-a2b6-6b1623f0adcb","Type":"ContainerStarted","Data":"fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505"}
Apr 17 17:21:20.155556 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:20.155559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" event={"ID":"54434768-8ae8-4866-a2b6-6b1623f0adcb","Type":"ContainerStarted","Data":"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"}
Apr 17 17:21:20.172422 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:20.172372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" podStartSLOduration=1.1686370209999999 podStartE2EDuration="5.172357405s" podCreationTimestamp="2026-04-17 17:21:15 +0000 UTC" firstStartedPulling="2026-04-17 17:21:15.483487471 +0000 UTC m=+2987.005064403" lastFinishedPulling="2026-04-17 17:21:19.487207859 +0000 UTC m=+2991.008784787" observedRunningTime="2026-04-17 17:21:20.171349655 +0000 UTC m=+2991.692926615" watchObservedRunningTime="2026-04-17 17:21:20.172357405 +0000 UTC m=+2991.693934355"
Apr 17 17:21:30.178835 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:30.178728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 17:21:30.184526 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:30.182655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 17:21:30.184526 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:30.183023 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log"
Apr 17 17:21:30.186807 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:30.186787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log"
Apr 17 17:21:38.216097 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:38.216050 2578 generic.go:358] "Generic (PLEG): container finished" podID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerID="73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f" exitCode=0
Apr 17 17:21:38.216479 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:38.216123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" event={"ID":"54434768-8ae8-4866-a2b6-6b1623f0adcb","Type":"ContainerDied","Data":"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"}
Apr 17 17:21:38.216479 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:38.216431 2578 scope.go:117] "RemoveContainer" containerID="73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"
Apr 17 17:21:39.169774 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:39.169749 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4mgj_must-gather-l6vnw_54434768-8ae8-4866-a2b6-6b1623f0adcb/gather/0.log"
Apr 17 17:21:42.697329 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:42.697294 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lh87q_7a2aba92-9d01-443d-82e4-0f79cdef282b/global-pull-secret-syncer/0.log"
Apr 17 17:21:42.868800 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:42.868769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vpndd_30498c9f-32f4-458b-914f-a3fc1f718376/konnectivity-agent/0.log"
Apr 17 17:21:42.992993 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:42.992921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-170.ec2.internal_72d3944a84d00c65c1b4be69187354b2/haproxy/0.log"
Apr 17 17:21:44.617525 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.617490 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g4mgj/must-gather-l6vnw"]
Apr 17 17:21:44.617995 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.617708 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="copy" containerID="cri-o://fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505" gracePeriod=2
Apr 17 17:21:44.623910 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.623886 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g4mgj/must-gather-l6vnw"]
Apr 17 17:21:44.843735 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.843707 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4mgj_must-gather-l6vnw_54434768-8ae8-4866-a2b6-6b1623f0adcb/copy/0.log"
Apr 17 17:21:44.844052 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.844035 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:44.846058 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.846032 2578 status_manager.go:895] "Failed to get status for pod" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" pod="openshift-must-gather-g4mgj/must-gather-l6vnw" err="pods \"must-gather-l6vnw\" is forbidden: User \"system:node:ip-10-0-138-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g4mgj\": no relationship found between node 'ip-10-0-138-170.ec2.internal' and this object"
Apr 17 17:21:44.954714 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.954648 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output\") pod \"54434768-8ae8-4866-a2b6-6b1623f0adcb\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") "
Apr 17 17:21:44.954824 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.954717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks2xl\" (UniqueName: \"kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl\") pod \"54434768-8ae8-4866-a2b6-6b1623f0adcb\" (UID: \"54434768-8ae8-4866-a2b6-6b1623f0adcb\") "
Apr 17 17:21:44.956187 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.956163 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "54434768-8ae8-4866-a2b6-6b1623f0adcb" (UID: "54434768-8ae8-4866-a2b6-6b1623f0adcb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:21:44.957105 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:44.957084 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl" (OuterVolumeSpecName: "kube-api-access-ks2xl") pod "54434768-8ae8-4866-a2b6-6b1623f0adcb" (UID: "54434768-8ae8-4866-a2b6-6b1623f0adcb"). InnerVolumeSpecName "kube-api-access-ks2xl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:21:45.055444 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.055411 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks2xl\" (UniqueName: \"kubernetes.io/projected/54434768-8ae8-4866-a2b6-6b1623f0adcb-kube-api-access-ks2xl\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\""
Apr 17 17:21:45.055444 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.055438 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54434768-8ae8-4866-a2b6-6b1623f0adcb-must-gather-output\") on node \"ip-10-0-138-170.ec2.internal\" DevicePath \"\""
Apr 17 17:21:45.170936 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.170909 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" path="/var/lib/kubelet/pods/54434768-8ae8-4866-a2b6-6b1623f0adcb/volumes"
Apr 17 17:21:45.239050 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.239001 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4mgj_must-gather-l6vnw_54434768-8ae8-4866-a2b6-6b1623f0adcb/copy/0.log"
Apr 17 17:21:45.239338 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.239314 2578 generic.go:358] "Generic (PLEG): container finished" podID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerID="fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505" exitCode=143
Apr 17 17:21:45.239479 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.239370 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4mgj/must-gather-l6vnw"
Apr 17 17:21:45.239479 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.239398 2578 scope.go:117] "RemoveContainer" containerID="fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505"
Apr 17 17:21:45.247025 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.247007 2578 scope.go:117] "RemoveContainer" containerID="73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"
Apr 17 17:21:45.259270 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.259250 2578 scope.go:117] "RemoveContainer" containerID="fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505"
Apr 17 17:21:45.259550 ip-10-0-138-170 kubenswrapper[2578]: E0417 17:21:45.259529 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505\": container with ID starting with fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505 not found: ID does not exist" containerID="fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505"
Apr 17 17:21:45.259600 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.259559 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505"} err="failed to get container status \"fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505\": rpc error: code = NotFound desc = could not find container \"fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505\": container with ID starting with fe7779c24edeaa6ac1097d9635863482d78635483e1ea970682b65ce8e3e1505 not found: ID does not exist"
Apr 17 17:21:45.259600 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.259578 2578 scope.go:117] "RemoveContainer" containerID="73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"
Apr 17 17:21:45.259834 ip-10-0-138-170 kubenswrapper[2578]: E0417 17:21:45.259815 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f\": container with ID starting with 73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f not found: ID does not exist" containerID="73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"
Apr 17 17:21:45.259891 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:45.259843 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f"} err="failed to get container status \"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f\": rpc error: code = NotFound desc = could not find container \"73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f\": container with ID starting with 73763086d94c36fd6d39fe30210a89545504b4aa88f9e78903fdf0c591a3b53f not found: ID does not exist"
Apr 17 17:21:46.178996 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.178967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/alertmanager/0.log"
Apr 17 17:21:46.208383 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.208357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/config-reloader/0.log"
Apr 17 17:21:46.242800 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.242761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/kube-rbac-proxy-web/0.log"
Apr 17 17:21:46.273971 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.273953 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/kube-rbac-proxy/0.log"
Apr 17 17:21:46.299201 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.299176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/kube-rbac-proxy-metric/0.log"
Apr 17 17:21:46.325508 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.325486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/prom-label-proxy/0.log"
Apr 17 17:21:46.356123 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.356095 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a128d97c-5289-4b89-9e74-6c42982f3eba/init-config-reloader/0.log"
Apr 17 17:21:46.680379 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.680359 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjzz4_1731b992-77af-4172-8c09-e0f9502982e1/node-exporter/0.log"
Apr 17 17:21:46.712372 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.712348 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjzz4_1731b992-77af-4172-8c09-e0f9502982e1/kube-rbac-proxy/0.log"
Apr 17 17:21:46.737766 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.737745 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjzz4_1731b992-77af-4172-8c09-e0f9502982e1/init-textfile/0.log"
Apr 17 17:21:46.953204 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.953148 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/prometheus/0.log"
Apr 17 17:21:46.973550 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.973532 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/config-reloader/0.log"
Apr 17 17:21:46.998176 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:46.998154 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/thanos-sidecar/0.log"
Apr 17 17:21:47.024602 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.024581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/kube-rbac-proxy-web/0.log"
Apr 17 17:21:47.053070 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.053053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/kube-rbac-proxy/0.log"
Apr 17 17:21:47.084791 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.084765 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/kube-rbac-proxy-thanos/0.log"
Apr 17 17:21:47.111848 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.111826 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1f7ecf39-be02-4bdd-8d5f-bed43dd9c24d/init-config-reloader/0.log"
Apr 17 17:21:47.143232 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.143214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jz2ql_e90eeafd-bb31-4b7a-a3ec-9103e9a76283/prometheus-operator/0.log"
Apr 17 17:21:47.169704 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.169688 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jz2ql_e90eeafd-bb31-4b7a-a3ec-9103e9a76283/kube-rbac-proxy/0.log"
Apr 17 17:21:47.195854 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.195835 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-r2s7q_c9a73140-1bc2-42f0-be27-699cd3ace384/prometheus-operator-admission-webhook/0.log"
Apr 17 17:21:47.319353 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.319330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/thanos-query/0.log"
Apr 17 17:21:47.347326 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.347289 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/kube-rbac-proxy-web/0.log"
Apr 17 17:21:47.373967 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.373942 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/kube-rbac-proxy/0.log"
Apr 17 17:21:47.397534 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.397513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/prom-label-proxy/0.log"
Apr 17 17:21:47.425984 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.425965 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/kube-rbac-proxy-rules/0.log"
Apr 17 17:21:47.457728 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:47.457709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67546d9545-sppsg_d0e8f0d0-9cf7-484f-8943-5638ec9dfcc0/kube-rbac-proxy-metrics/0.log"
Apr 17 17:21:48.539528 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:48.539421 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-dcvbr_70703068-0a34-4ea5-8d18-b0d1a8b73858/networking-console-plugin/0.log"
Apr 17 17:21:48.975579 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:48.975545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/2.log"
Apr 17 17:21:48.978808 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:48.978791 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t7k46_8989b18c-2718-4e13-895b-5944e510a981/console-operator/3.log"
Apr 17 17:21:49.536131 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536090 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"]
Apr 17 17:21:49.536492 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536475 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="gather"
Apr 17 17:21:49.536577 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536495 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="gather"
Apr 17 17:21:49.536577 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536510 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="copy"
Apr 17 17:21:49.536577 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536517 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="copy"
Apr 17 17:21:49.536725 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536626 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="copy"
Apr 17 17:21:49.536725 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.536645 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="54434768-8ae8-4866-a2b6-6b1623f0adcb" containerName="gather"
Apr 17 17:21:49.541628 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.541607 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.544008 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.543987 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"kube-root-ca.crt\""
Apr 17 17:21:49.544126 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.544111 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmwvb\"/\"default-dockercfg-dkhbl\""
Apr 17 17:21:49.544879 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.544856 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"openshift-service-ca.crt\""
Apr 17 17:21:49.551473 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.551454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"]
Apr 17 17:21:49.693270 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.693242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-sys\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.693401 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.693279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgt4m\" (UniqueName: \"kubernetes.io/projected/3f806675-c5a3-45ab-ade3-de3000c9eb2e-kube-api-access-hgt4m\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.693401 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.693339 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-proc\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.693401 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.693365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-lib-modules\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.693401 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.693389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-podres\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794169 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794143 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-sys\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794307 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgt4m\" (UniqueName: \"kubernetes.io/projected/3f806675-c5a3-45ab-ade3-de3000c9eb2e-kube-api-access-hgt4m\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794307 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-proc\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794307 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-sys\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794307 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-proc\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794507 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-lib-modules\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794507 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-podres\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794600 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-lib-modules\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.794600 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.794551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f806675-c5a3-45ab-ade3-de3000c9eb2e-podres\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.805191 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.805171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgt4m\" (UniqueName: \"kubernetes.io/projected/3f806675-c5a3-45ab-ade3-de3000c9eb2e-kube-api-access-hgt4m\") pod \"perf-node-gather-daemonset-bgz9f\" (UID: \"3f806675-c5a3-45ab-ade3-de3000c9eb2e\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.851839 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.851817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:49.976853 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:49.976759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"]
Apr 17 17:21:49.979497 ip-10-0-138-170 kubenswrapper[2578]: W0417 17:21:49.979468 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3f806675_c5a3_45ab_ade3_de3000c9eb2e.slice/crio-48e47e2c5e6c57bf281064847343c330f7573a556185a8745ce4afa2c4be41bc WatchSource:0}: Error finding container 48e47e2c5e6c57bf281064847343c330f7573a556185a8745ce4afa2c4be41bc: Status 404 returned error can't find the container with id 48e47e2c5e6c57bf281064847343c330f7573a556185a8745ce4afa2c4be41bc
Apr 17 17:21:50.255235 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.255202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f" event={"ID":"3f806675-c5a3-45ab-ade3-de3000c9eb2e","Type":"ContainerStarted","Data":"2a1e81ebac01b24a3d02d65a49877707d406adb4a63968a1b4379445df216e7d"}
Apr 17 17:21:50.255235 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.255236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f" event={"ID":"3f806675-c5a3-45ab-ade3-de3000c9eb2e","Type":"ContainerStarted","Data":"48e47e2c5e6c57bf281064847343c330f7573a556185a8745ce4afa2c4be41bc"}
Apr 17 17:21:50.255386 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.255260 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:50.274107 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.274045 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f" podStartSLOduration=1.2740321510000001 podStartE2EDuration="1.274032151s" podCreationTimestamp="2026-04-17 17:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:21:50.271810995 +0000 UTC m=+3021.793387957" watchObservedRunningTime="2026-04-17 17:21:50.274032151 +0000 UTC m=+3021.795609101"
Apr 17 17:21:50.539607 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.539528 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ghkgl_c56755ae-c685-4cd5-a21d-9b2df9f5189f/dns/0.log"
Apr 17 17:21:50.566379 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.566349 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ghkgl_c56755ae-c685-4cd5-a21d-9b2df9f5189f/kube-rbac-proxy/0.log"
Apr 17 17:21:50.702088 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:50.702044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l9qg_61ffcc07-b8ef-4fcc-ab95-d8a4d75484df/dns-node-resolver/0.log"
Apr 17 17:21:51.231780 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:51.231749 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5ft4z_eb979380-a8c1-43a4-b8ad-f3ba0967a2d7/node-ca/0.log"
Apr 17 17:21:52.347181 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.347153 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lfzcd_5eb99a8d-95ed-4e6b-8181-59a683f03f29/serve-healthcheck-canary/0.log"
Apr 17 17:21:52.827017 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.826992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zgh5t_52c76994-eea6-40ad-81ff-21383f7c251b/insights-operator/0.log"
Apr 17 17:21:52.828001 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.827981 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zgh5t_52c76994-eea6-40ad-81ff-21383f7c251b/insights-operator/1.log"
Apr 17 17:21:52.931465 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.931437 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q8sb6_dc0f2f40-0d74-466f-8161-40616ba653a0/kube-rbac-proxy/0.log"
Apr 17 17:21:52.957399 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.957383 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q8sb6_dc0f2f40-0d74-466f-8161-40616ba653a0/exporter/0.log"
Apr 17 17:21:52.981925 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:52.981879 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q8sb6_dc0f2f40-0d74-466f-8161-40616ba653a0/extractor/0.log"
Apr 17 17:21:55.153573 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:55.153542 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-97mcb_b6fbe312-3c47-4597-97d9-f46977eba305/manager/0.log"
Apr 17 17:21:55.682476 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:55.682450 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-fk4r2_06b869af-d630-4292-899b-5c96d1f04f1c/s3-init/0.log"
Apr 17 17:21:56.269543 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:56.269516 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-bgz9f"
Apr 17 17:21:59.798160 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:59.798134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dpq59_112efc8b-0a44-456a-8159-1b69f0cd48ea/migrator/0.log"
Apr 17 17:21:59.819181 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:21:59.819160 2578 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dpq59_112efc8b-0a44-456a-8159-1b69f0cd48ea/graceful-termination/0.log" Apr 17 17:22:01.440935 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.440905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/kube-multus-additional-cni-plugins/0.log" Apr 17 17:22:01.468987 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.468963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/egress-router-binary-copy/0.log" Apr 17 17:22:01.491817 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.491794 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/cni-plugins/0.log" Apr 17 17:22:01.517461 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.517442 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/bond-cni-plugin/0.log" Apr 17 17:22:01.547109 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.547088 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/routeoverride-cni/0.log" Apr 17 17:22:01.583492 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.583473 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/whereabouts-cni-bincopy/0.log" Apr 17 17:22:01.615425 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.615406 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdzlp_8039245d-5cc0-42eb-bd46-e84c3ff6d2dd/whereabouts-cni/0.log" 
Apr 17 17:22:01.685180 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.685155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lg6kr_f8e1f18a-02d8-4db9-8e72-f140011fc044/kube-multus/0.log" Apr 17 17:22:01.794362 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.794336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-598xw_a6f8630a-c602-4066-a1c1-66f602f947fc/network-metrics-daemon/0.log" Apr 17 17:22:01.818843 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:01.818819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-598xw_a6f8630a-c602-4066-a1c1-66f602f947fc/kube-rbac-proxy/0.log" Apr 17 17:22:02.690416 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.690388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-controller/0.log" Apr 17 17:22:02.714198 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.714169 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/0.log" Apr 17 17:22:02.726472 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.726451 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovn-acl-logging/1.log" Apr 17 17:22:02.760616 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.760595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/kube-rbac-proxy-node/0.log" Apr 17 17:22:02.803910 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.803892 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:22:02.851897 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.851876 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/northd/0.log" Apr 17 17:22:02.880275 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.880256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/nbdb/0.log" Apr 17 17:22:02.906255 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.906238 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/sbdb/0.log" Apr 17 17:22:02.999886 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:02.999822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jknk_e9449b84-7aaa-4237-8ea9-618f1fb0c8be/ovnkube-controller/0.log" Apr 17 17:22:04.675621 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:04.675590 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-5z9sm_921f716a-048a-4236-8a81-8bb9b570e437/check-endpoints/0.log" Apr 17 17:22:04.756379 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:04.756355 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hqwh2_ffde06b8-a22f-482c-89a5-3fa86598f73d/network-check-target-container/0.log" Apr 17 17:22:05.660163 ip-10-0-138-170 kubenswrapper[2578]: I0417 17:22:05.660136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f8jht_6d10aa7a-8020-44ad-9772-7262239be5f1/iptables-alerter/0.log"