Apr 21 15:32:48.830271 ip-10-0-132-141 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:32:48.830284 ip-10-0-132-141 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:32:48.830291 ip-10-0-132-141 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:32:48.830492 ip-10-0-132-141 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:32:58.832637 ip-10-0-132-141 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:32:58.832655 ip-10-0-132-141 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dc769067efc54a4daa2db3b7539a6281 --
Apr 21 15:35:34.833227 ip-10-0-132-141 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:35.333161 ip-10-0-132-141 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:35.333161 ip-10-0-132-141 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:35.333161 ip-10-0-132-141 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:35.333161 ip-10-0-132-141 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:35:35.333161 ip-10-0-132-141 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:35.336962 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.336859 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:35.340197 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340182 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:35.340197 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340198 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340202 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340205 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340208 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340211 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340214 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340216 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340219 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340221 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340224 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340226 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340236 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340240 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340243 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340246 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340248 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340251 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340254 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340256 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340260 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:35.340268 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340263 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340265 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340267 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340270 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340273 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340276 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340278 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340281 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340297 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340300 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340302 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340305 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340308 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340310 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340313 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340315 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340317 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340320 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340322 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340325 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:35.340742 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340327 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340330 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340332 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340335 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340337 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340340 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340342 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340345 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340347 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340349 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340352 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340354 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340356 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340359 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340362 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340365 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340368 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340370 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340373 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340376 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:35.341240 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340378 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340382 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340384 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340386 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340389 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340391 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340394 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340396 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340399 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340401 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340404 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340408 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340412 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340415 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340418 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340421 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340423 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340428 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340431 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340434 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:35.341753 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340437 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340439 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340442 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340445 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340447 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340824 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340830 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340833 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340836 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340839 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340842 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340846 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340849 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340851 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340854 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340856 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340860 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340865 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340868 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340871 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:35.342254 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340874 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340877 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340879 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340882 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340884 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340886 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340889 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340891 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340894 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340896 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340898 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340901 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340903 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340906 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340908 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340911 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340913 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340916 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340919 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340921 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:35.342722 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340924 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340927 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340929 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340932 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340934 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340952 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340954 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340957 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340960 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340962 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340965 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340968 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340971 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340973 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340976 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340978 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340981 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340983 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340986 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340989 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:35.343333 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340991 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340994 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340996 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.340999 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341002 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341005 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341007 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341010 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341012 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341015 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341019 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341022 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341024 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341027 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341029 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341032 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341034 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341037 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341039 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341042 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:35.343814 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341044 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341047 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341049 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341052 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341055 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341057 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341060 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341062 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341064 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341067 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.341069 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341808 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341817 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341822 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341827 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341830 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341834 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341838 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341843 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341846 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341849 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:35.344332 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341852 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341856 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341860 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341863 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341865 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341868 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341871 2579 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341874 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341877 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341881 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341884 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341887 2579 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341890 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341893 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341896 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341900 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341903 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341906 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341909 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341912 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341914 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341917 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341920 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341924 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341927 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:35.344847 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341930 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341933 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341935 2579 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341953 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341957 2579 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341960 2579 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341963 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341966 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341970 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421
15:35:35.341974 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341977 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341980 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341983 2579 flags.go:64] FLAG: --eviction-soft="" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341986 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341988 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341991 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341994 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341997 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.341999 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342002 2579 flags.go:64] FLAG: --feature-gates="" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342006 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342008 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342012 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342015 2579 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342018 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342021 2579 flags.go:64] FLAG: --help="false" Apr 21 15:35:35.345466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342024 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-132-141.ec2.internal" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342027 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342030 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342033 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342036 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342039 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342042 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342045 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342048 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342050 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342053 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:35.346112 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342056 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342058 2579 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342061 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342065 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342068 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342070 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342073 2579 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342076 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342079 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342082 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342087 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342090 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342092 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:35:35.346112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342095 2579 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342098 2579 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342101 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342104 2579 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342107 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342111 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342114 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342119 2579 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342121 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342125 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342127 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342130 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342133 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342136 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342139 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342146 2579 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342149 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342152 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342155 2579 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342158 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342166 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342169 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342172 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342175 2579 flags.go:64] FLAG: --port="10250" Apr 21 15:35:35.346723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342178 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342181 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c0a6a3bdb0f47ddd" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342184 2579 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342187 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342190 2579 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342193 2579 flags.go:64] FLAG: --register-schedulable="true" 
Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342196 2579 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342199 2579 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342202 2579 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342205 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342208 2579 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342211 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342214 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342217 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342220 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342224 2579 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342226 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342229 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342232 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342235 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342238 2579 flags.go:64] FLAG: 
--storage-driver-buffer-duration="1m0s" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342241 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342244 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342247 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342249 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342252 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:35:35.347326 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342255 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342258 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342261 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342264 2579 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342267 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342273 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342276 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342279 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342283 2579 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:35:35.347977 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342286 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342288 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342291 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342294 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342297 2579 flags.go:64] FLAG: --v="2" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342301 2579 flags.go:64] FLAG: --version="false" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342305 2579 flags.go:64] FLAG: --vmodule="" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342309 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.342312 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342403 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342407 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342411 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342414 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342417 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS 
Apr 21 15:35:35.347977 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342420 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342423 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342425 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342428 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342430 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342433 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342435 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342438 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342440 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342446 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342448 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342451 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342453 2579 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342456 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342459 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342462 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342464 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342467 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342469 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342472 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:35.348552 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342475 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342477 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342480 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342482 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342485 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:35.349097 ip-10-0-132-141 
kubenswrapper[2579]: W0421 15:35:35.342487 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342490 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342492 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342494 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342497 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342499 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342502 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342504 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342507 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342509 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342512 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342514 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342517 2579 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342519 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342522 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:35.349097 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342524 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342528 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342530 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342532 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342535 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342538 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342540 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342543 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342545 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342548 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 
15:35:35.342551 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342553 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342555 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342558 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342561 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342564 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342566 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342569 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342571 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342574 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:35.349570 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342576 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342579 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342581 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:35.350104 
ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342583 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342586 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342589 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342593 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342598 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342601 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342603 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342606 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342608 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342610 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342614 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342617 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342620 2579 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342624 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342627 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342629 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:35.350104 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342632 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.342634 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.343460 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.350060 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.350076 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350124 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350128 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350131 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350135 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350138 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350141 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350143 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350146 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350148 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350151 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350154 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:35.350573 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350157 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350159 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350163 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350165 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350168 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350171 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350173 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350175 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350178 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350181 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350183 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350185 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350188 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350190 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350192 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350195 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350197 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350200 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350202 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350205 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:35.350995 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350208 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350211 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350213 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350216 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350218 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350221 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350223 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350226 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350229 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350231 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350234 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350236 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350239 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350241 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350243 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350247 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350249 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350251 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350254 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350257 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:35.351481 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350259 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350261 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350264 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350266 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350268 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350271 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350274 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350276 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350279 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350281 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350284 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350287 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350291 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350296 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350299 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350302 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350305 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350308 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350310 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:35.352045 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350313 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350315 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350317 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350320 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350322 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350324 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350327 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350330 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350332 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350335 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350338 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350340 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350344 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350347 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350350 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:35.352517 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350353 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.350357 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350476 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350483 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350486 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350489 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350491 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350494 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350496 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350499 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350502 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350505 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350507 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350510 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350512 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350515 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:35.352880 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350517 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350520 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350523 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350526 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350530 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350532 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350535 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350537 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350540 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350542 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350545 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350552 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350556 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350560 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350562 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350565 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350568 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350570 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350573 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:35.353338 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350575 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350578 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350581 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350583 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350586 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350588 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350591 2579 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350593 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350596 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350598 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350601 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350603 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350606 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350608 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350611 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350613 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350615 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350618 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350620 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 
15:35:35.350622 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:35.353803 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350625 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350627 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350629 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350632 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350634 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350643 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350645 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350648 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350650 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350652 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350655 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350657 2579 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350659 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350662 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350664 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350667 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350669 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350671 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350674 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350676 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:35.354305 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350679 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350681 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350684 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350686 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 
15:35:35.350688 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350691 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350693 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350695 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350698 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350700 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350703 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350705 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:35.350707 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.350712 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:35.351444 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 15:35:35.354793 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.354117 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 15:35:35.355179 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.355041 2579 server.go:1019] "Starting client certificate rotation" Apr 21 15:35:35.355179 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.355142 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:35:35.355816 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.355805 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:35:35.392782 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.392759 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:35:35.395225 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.395196 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:35:35.413819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.413796 2579 log.go:25] "Validated CRI v1 runtime API" Apr 21 15:35:35.420644 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.420624 2579 log.go:25] "Validated CRI v1 image API" Apr 21 15:35:35.421828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.421814 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 15:35:35.426419 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.426402 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:35.428159 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:35:35.428137 2579 fs.go:135] Filesystem UUIDs: map[0663da50-b965-4fbd-a614-b32f569d0613:/dev/nvme0n1p4 4b6215af-feb4-468a-aad3-7fcc7958de45:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 21 15:35:35.428203 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.428160 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 15:35:35.434405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.434295 2579 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:35.433179902 +0000 UTC m=+0.461346689 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100444 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec272cf1315ce778f21f4e980f357a12 SystemUUID:ec272cf1-315c-e778-f21f-4e980f357a12 BootID:dc769067-efc5-4a4d-aa2d-b3b7539a6281 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 
Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:93:a8:c6:f2:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:93:a8:c6:f2:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:80:d7:dc:07:85 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 15:35:35.434405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.434401 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 15:35:35.434532 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.434520    2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:35:35.435969 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.435930    2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:35:35.436117 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.435972    2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-141.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:35:35.436159 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.436129    2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:35:35.436159 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.436138    2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:35:35.436159 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.436155    2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:35.436245 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.436171    2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:35.437456 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.437446    2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:35.437574 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.437565    2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:35:35.440781 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.440770    2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:35:35.440819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.440785    2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:35:35.440819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.440801    2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:35:35.440819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.440811    2579 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:35:35.440895 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.440827    2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:35:35.442281 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.442258    2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:35.442373 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.442289    2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:35.446788 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.446766    2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:35:35.448064 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.448051    2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 15:35:35.449746 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449731    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 15:35:35.449746 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449749    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449756    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449762    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449768    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449774    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449782    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449789    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449796    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449802    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449811    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 15:35:35.449857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.449820    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 15:35:35.450748 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.450737    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 15:35:35.450748 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.450748    2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 15:35:35.451339 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.451300    2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 15:35:35.451339 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.451322    2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-141.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 15:35:35.453965 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.453926    2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hfkp8"
Apr 21 15:35:35.454466 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.454455    2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 15:35:35.454505 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.454496    2579 server.go:1295] "Started kubelet"
Apr 21 15:35:35.454636 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.454583    2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 15:35:35.455159 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.455122    2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 15:35:35.455192 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.455178    2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 15:35:35.455349 ip-10-0-132-141 systemd[1]: Started Kubernetes Kubelet.
Apr 21 15:35:35.457033 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.457012    2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 15:35:35.457375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.457361    2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 15:35:35.461008 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.460988    2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hfkp8"
Apr 21 15:35:35.461454 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.461440    2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-141.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 15:35:35.463265 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.461438    2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-141.ec2.internal.18a869366bbea747 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-141.ec2.internal,UID:ip-10-0-132-141.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-141.ec2.internal,},FirstTimestamp:2026-04-21 15:35:35.454467911 +0000 UTC m=+0.482634698,LastTimestamp:2026-04-21 15:35:35.454467911 +0000 UTC m=+0.482634698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-141.ec2.internal,}"
Apr 21 15:35:35.463265 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.463173    2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:35.463674 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.463651    2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 15:35:35.464239 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464213    2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 15:35:35.464239 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464219    2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 15:35:35.464383 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464252    2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 15:35:35.464437 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464404    2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 15:35:35.464437 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464414    2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 15:35:35.464531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464436    2579 factory.go:55] Registering systemd factory
Apr 21 15:35:35.464531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464456    2579 factory.go:223] Registration of the systemd container factory successfully
Apr 21 15:35:35.464615 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.464593    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464774    2579 factory.go:153] Registering CRI-O factory
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464792    2579 factory.go:223] Registration of the crio container factory successfully
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464879    2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464903    2579 factory.go:103] Registering Raw factory
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.464919    2579 manager.go:1196] Started watching for new ooms in manager
Apr 21 15:35:35.465477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.465425    2579 manager.go:319] Starting recovery of all containers
Apr 21 15:35:35.473510 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.473481    2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:35.474276 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.474246    2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 15:35:35.476345 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.476281    2579 manager.go:324] Recovery completed
Apr 21 15:35:35.478100 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.477434    2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-141.ec2.internal\" not found" node="ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.478519 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.478498    2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 21 15:35:35.481480 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.481465    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.484004 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.483991    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.484138 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.484020    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.484138 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.484033    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.484566 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.484551    2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 15:35:35.484566 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.484566    2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 15:35:35.484657 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.484584    2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:35.486796 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.486785    2579 policy_none.go:49] "None policy: Start"
Apr 21 15:35:35.486841 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.486801    2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 15:35:35.486841 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.486810    2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 15:35:35.526509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.526492    2579 manager.go:341] "Starting Device Plugin manager"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.526583    2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.526598    2579 server.go:85] "Starting device plugin registration server"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.526836    2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.526846    2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.526964    2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.527103    2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.527113    2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.527517    2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 15:35:35.538617 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.527558    2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.603707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.603627    2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 15:35:35.603707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.603666    2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 15:35:35.603707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.603685    2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 15:35:35.603707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.603692    2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 15:35:35.604011 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.603724    2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 15:35:35.606258 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.606238    2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:35.627366 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.627341    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.628293 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.628266    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.628398 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.628302    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.628398 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.628317    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.628398 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.628346    2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.637886 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.637870    2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.637985 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.637893    2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-141.ec2.internal\": node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.660715 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.660689    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.704675 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.704641    2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"]
Apr 21 15:35:35.704757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.704715    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.706408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.706393    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.706500 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.706419    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.706500 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.706429    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.707610 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.707598    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.707734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.707721    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.707770 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.707754    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.708388 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708371    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.708484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708394    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.708484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708404    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.708484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708450    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.708484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708473    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.708668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.708486    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.709439 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.709424    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.709518 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.709455    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:35.710225 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.710200    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:35.710317 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.710232    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:35.710317 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.710246    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:35.741547 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.741521    2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-141.ec2.internal\" not found" node="ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.745993 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.745977    2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-141.ec2.internal\" not found" node="ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.761351 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.761333    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.766689 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.766675    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.766744 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.766699    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/546d5bb4e1c933cd7732e2f27360ece8-config\") pod \"kube-apiserver-proxy-ip-10-0-132-141.ec2.internal\" (UID: \"546d5bb4e1c933cd7732e2f27360ece8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.766744 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.766723    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.861684 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.861605    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:35.866966 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.866928    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.867055 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.866987    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.867055 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.866995    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.867055 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.867011    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/546d5bb4e1c933cd7732e2f27360ece8-config\") pod \"kube-apiserver-proxy-ip-10-0-132-141.ec2.internal\" (UID: \"546d5bb4e1c933cd7732e2f27360ece8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.867055 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.867043    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/546d5bb4e1c933cd7732e2f27360ece8-config\") pod \"kube-apiserver-proxy-ip-10-0-132-141.ec2.internal\" (UID: \"546d5bb4e1c933cd7732e2f27360ece8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.867199 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:35.867063    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceaff6e191396af72f395bd243596937-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal\" (UID: \"ceaff6e191396af72f395bd243596937\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:35.962415 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:35.962373    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:36.043868 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.043832    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:36.048479 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.048460    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:36.063749 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.063076    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:36.164317 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.164212    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:36.264682 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.264630    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:36.355093 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.355055    2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:36.355745 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.355200    2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:36.355745 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.355212    2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:36.365294 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.365259    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-141.ec2.internal\" not found"
Apr 21 15:35:36.424160 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.424101    2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:36.441170 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.441153    2579 apiserver.go:52] "Watching apiserver"
Apr 21 15:35:36.453883 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.453851    2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 15:35:36.454225 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.454201    2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2xzlk","openshift-multus/multus-tcxzq","openshift-network-diagnostics/network-check-target-zmzmd","openshift-network-operator/iptables-alerter-4v6vb","openshift-ovn-kubernetes/ovnkube-node-7tbgn","openshift-image-registry/node-ca-fj5g2","openshift-multus/multus-additional-cni-plugins-mz26j","openshift-multus/network-metrics-daemon-x5zkt","kube-system/konnectivity-agent-crzt4","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f","openshift-cluster-node-tuning-operator/tuned-bgm7g"]
Apr 21 15:35:36.456530 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.456508    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2xzlk"
Apr 21 15:35:36.457561 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.457526    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.457680 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.457587    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:35:36.457680 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.457665    2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9"
Apr 21 15:35:36.458721 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.458703    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4v6vb"
Apr 21 15:35:36.459805 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.459791    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.461033 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461020    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.461709 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461350    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.462313 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461883    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 15:35:36.462313 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461978    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.462313 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461986    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:36.462313 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462025    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 15:35:36.462313 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.461980    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.462568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462556    2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462604    2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:35 +0000 UTC" deadline="2028-01-11 00:37:58.456317522 +0000 UTC"
Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462630    2579 certificate_manager.go:431] "Waiting for
next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15105h2m21.993691172s" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5qjrv\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8q28v\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.462787 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463077 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463461 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463562 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qxmnr\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463461 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4fz95\"" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463739 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 15:35:36.463919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.463785 2579 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal" Apr 21 15:35:36.464772 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.464756 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:36.464892 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.464868 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:36.466527 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.466158 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.467642 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467422 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:35:36.467642 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467527 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:35:36.467642 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467532 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:35:36.467642 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467634 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:35:36.467892 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j7cc2\"" Apr 21 15:35:36.467892 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467757 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:35:36.467892 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.467856 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:35:36.468108 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.468091 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:35:36.468771 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.468754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.469532 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-netns\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.469630 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-hostroot\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.469630 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469553 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.469630 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovn-node-metrics-cert\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.469630 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-systemd-units\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-var-lib-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-env-overrides\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469696 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-os-release\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-multus\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-kubelet\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a396e4e6-5a05-450a-8a8c-263dd9674c34-tmp-dir\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469773 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5rk9\" (UniqueName: \"kubernetes.io/projected/a396e4e6-5a05-450a-8a8c-263dd9674c34-kube-api-access-k5rk9\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " 
pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.469820 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxqn\" (UniqueName: \"kubernetes.io/projected/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-kube-api-access-4kxqn\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-slash\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-systemd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-etc-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.469992 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-os-release\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-cnibin\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a396e4e6-5a05-450a-8a8c-263dd9674c34-hosts-file\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470067 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:36.470085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-bin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-netns\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e5ace7b-a572-41f7-a65f-d0a88596a32b-host-slash\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470171 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.470189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470187 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-system-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-daemon-config\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-multus-certs\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-ovn\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-config\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470318 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmn96\" (UniqueName: \"kubernetes.io/projected/b151d377-fb3e-44d5-a5e1-57bb572347d7-kube-api-access-hmn96\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cnibin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cni-binary-copy\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-script-lib\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470445 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-conf-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fmw6\" (UniqueName: \"kubernetes.io/projected/df61a406-9ba7-4b2d-94b8-03e6d97a8118-kube-api-access-2fmw6\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jl4j\" (UniqueName: \"kubernetes.io/projected/6e5ace7b-a572-41f7-a65f-d0a88596a32b-kube-api-access-7jl4j\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " 
pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lmbv\" (UniqueName: \"kubernetes.io/projected/088bc7e8-4515-4c77-967b-a70ef32cd85e-kube-api-access-4lmbv\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.470833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-kubelet\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-bin\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-socket-dir-parent\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-netd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470861 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e5ace7b-a572-41f7-a65f-d0a88596a32b-iptables-alerter-script\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470891 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-k8s-cni-cncf-io\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.471626 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-system-cni-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.470987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-etc-kubernetes\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-node-log\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-log-socket\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471357 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471415 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 15:35:36.471626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471622 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 15:35:36.472311 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471745 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v64vk\""
Apr 21 15:35:36.472311 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471986 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.472311 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.471998 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 15:35:36.472311 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.472047 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sp8fc\""
Apr 21 15:35:36.475348 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.475329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 15:35:36.475444 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.475398 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 15:35:36.475788 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.475771 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 15:35:36.476390 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.476378 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.478546 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.478529 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5ftvd\""
Apr 21 15:35:36.478715 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.478681 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:36.478761 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.478745 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:36.478793 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.478779 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mtn8s\""
Apr 21 15:35:36.490130 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.490112 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:36.499625 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.499607 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal"]
Apr 21 15:35:36.500373 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.500359 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 15:35:36.500430 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.500416 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"
Apr 21 15:35:36.514348 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.514325 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wqhvj"
Apr 21 15:35:36.515138 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.515124 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal"]
Apr 21 15:35:36.515470 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.515459 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 15:35:36.525749 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.525728 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wqhvj"
Apr 21 15:35:36.547135 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.547102 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod546d5bb4e1c933cd7732e2f27360ece8.slice/crio-1735f030005c5c5713321f7974ff6df3b11796827081e198466f6556fb43c6e8 WatchSource:0}: Error finding container 1735f030005c5c5713321f7974ff6df3b11796827081e198466f6556fb43c6e8: Status 404 returned error can't find the container with id 1735f030005c5c5713321f7974ff6df3b11796827081e198466f6556fb43c6e8
Apr 21 15:35:36.547359 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.547338 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceaff6e191396af72f395bd243596937.slice/crio-9d96da81b5ae3de812da467c1a4cca9f08403d400309f384ff9930b7c5762cbf WatchSource:0}: Error finding container 9d96da81b5ae3de812da467c1a4cca9f08403d400309f384ff9930b7c5762cbf: Status 404 returned error can't find the container with id 9d96da81b5ae3de812da467c1a4cca9f08403d400309f384ff9930b7c5762cbf
Apr 21 15:35:36.551668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.551652 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:35:36.565224 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.565203 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 15:35:36.571237 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-ovn\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.571311 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571255 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-config\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.571349 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571310 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-ovn\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.571385 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmn96\" (UniqueName: \"kubernetes.io/projected/b151d377-fb3e-44d5-a5e1-57bb572347d7-kube-api-access-hmn96\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.571385 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-run\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.571460 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x765\" (UniqueName: \"kubernetes.io/projected/7b048749-149f-4a73-9f47-3ff1c3622ead-kube-api-access-8x765\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f"
Apr 21 15:35:36.571520 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cnibin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.571561 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cni-binary-copy\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.571614 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-script-lib\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.571614 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/657382f2-3c88-4d85-b5cf-5533d6e4b19e-serviceca\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2"
Apr 21 15:35:36.571695 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cnibin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.571736 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-kubernetes\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.571736 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.571812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571752 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.571812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-sys\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.571909 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.571909 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-config\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.571909 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571884 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-conf-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fmw6\" (UniqueName: \"kubernetes.io/projected/df61a406-9ba7-4b2d-94b8-03e6d97a8118-kube-api-access-2fmw6\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-conf-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.571986 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jl4j\" (UniqueName: \"kubernetes.io/projected/6e5ace7b-a572-41f7-a65f-d0a88596a32b-kube-api-access-7jl4j\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lmbv\" (UniqueName: \"kubernetes.io/projected/088bc7e8-4515-4c77-967b-a70ef32cd85e-kube-api-access-4lmbv\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.572073 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gln\" (UniqueName: \"kubernetes.io/projected/657382f2-3c88-4d85-b5cf-5533d6e4b19e-kube-api-access-25gln\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-host\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovnkube-script-lib\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-kubelet\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-cni-binary-copy\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-bin\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-bin\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-modprobe-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-conf\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-kubelet\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572260 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtzh9\" (UniqueName: \"kubernetes.io/projected/205b70c4-2794-4a50-8e82-7285027e2f8d-kube-api-access-jtzh9\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-sys-fs\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-socket-dir-parent\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-netd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572345 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-socket-dir-parent\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.572405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e5ace7b-a572-41f7-a65f-d0a88596a32b-iptables-alerter-script\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-var-lib-kubelet\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-tuned\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-cni-netd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-socket-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-k8s-cni-cncf-io\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-system-cni-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysconfig\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-konnectivity-ca\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572520 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-system-cni-dir\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-k8s-cni-cncf-io\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-etc-kubernetes\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/657382f2-3c88-4d85-b5cf-5533d6e4b19e-host\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2"
Apr 21 15:35:36.573234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-agent-certs\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-etc-kubernetes\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-device-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.572676 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-node-log\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-node-log\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.572738 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:37.07271108 +0000 UTC m=+2.100877868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-log-socket\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e5ace7b-a572-41f7-a65f-d0a88596a32b-iptables-alerter-script\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-netns\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-hostroot\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-netns\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-log-socket\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovn-node-metrics-cert\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-hostroot\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq"
Apr 21 15:35:36.573902 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-tmp\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g"
Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-systemd-units\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-var-lib-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572930 2579 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-env-overrides\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-os-release\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572981 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-var-lib-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-systemd-units\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.572992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-multus\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573016 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-multus\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-kubelet\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a396e4e6-5a05-450a-8a8c-263dd9674c34-tmp-dir\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-os-release\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-kubelet\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573082 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5rk9\" (UniqueName: \"kubernetes.io/projected/a396e4e6-5a05-450a-8a8c-263dd9674c34-kube-api-access-k5rk9\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxqn\" (UniqueName: \"kubernetes.io/projected/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-kube-api-access-4kxqn\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-slash\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574408 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573182 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-systemd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-slash\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-etc-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573270 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-etc-openvswitch\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a396e4e6-5a05-450a-8a8c-263dd9674c34-tmp-dir\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df61a406-9ba7-4b2d-94b8-03e6d97a8118-env-overrides\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-run-systemd\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-lib-modules\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-registration-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-os-release\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-cnibin\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a396e4e6-5a05-450a-8a8c-263dd9674c34-hosts-file\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a396e4e6-5a05-450a-8a8c-263dd9674c34-hosts-file\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/b151d377-fb3e-44d5-a5e1-57bb572347d7-cnibin\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-os-release\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.574873 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-etc-selinux\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-bin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573657 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b151d377-fb3e-44d5-a5e1-57bb572347d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573669 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-netns\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-run-netns\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e5ace7b-a572-41f7-a65f-d0a88596a32b-host-slash\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.575344 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e5ace7b-a572-41f7-a65f-d0a88596a32b-host-slash\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-var-lib-cni-bin\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-systemd\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df61a406-9ba7-4b2d-94b8-03e6d97a8118-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-system-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " 
pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-daemon-config\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-system-cni-dir\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-multus-certs\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.573929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-host-run-multus-certs\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.575344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.574685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-multus-daemon-config\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 
15:35:36.575929 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.575915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df61a406-9ba7-4b2d-94b8-03e6d97a8118-ovn-node-metrics-cert\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.595768 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.595753 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:36.595884 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.595771 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:36.595884 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.595781 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:36.595884 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:36.595827 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:37.095811776 +0000 UTC m=+2.123978555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:36.596411 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.596395 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jl4j\" (UniqueName: \"kubernetes.io/projected/6e5ace7b-a572-41f7-a65f-d0a88596a32b-kube-api-access-7jl4j\") pod \"iptables-alerter-4v6vb\" (UID: \"6e5ace7b-a572-41f7-a65f-d0a88596a32b\") " pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.598644 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.598629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lmbv\" (UniqueName: \"kubernetes.io/projected/088bc7e8-4515-4c77-967b-a70ef32cd85e-kube-api-access-4lmbv\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:36.607396 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.607351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal" event={"ID":"546d5bb4e1c933cd7732e2f27360ece8","Type":"ContainerStarted","Data":"1735f030005c5c5713321f7974ff6df3b11796827081e198466f6556fb43c6e8"} Apr 21 15:35:36.608352 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.608330 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal" event={"ID":"ceaff6e191396af72f395bd243596937","Type":"ContainerStarted","Data":"9d96da81b5ae3de812da467c1a4cca9f08403d400309f384ff9930b7c5762cbf"} Apr 21 15:35:36.620330 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.620313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5rk9\" (UniqueName: \"kubernetes.io/projected/a396e4e6-5a05-450a-8a8c-263dd9674c34-kube-api-access-k5rk9\") pod \"node-resolver-2xzlk\" (UID: \"a396e4e6-5a05-450a-8a8c-263dd9674c34\") " pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.626304 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.626288 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxqn\" (UniqueName: \"kubernetes.io/projected/49b893d4-ef45-4e0b-9df7-4cda6b46fd3d-kube-api-access-4kxqn\") pod \"multus-tcxzq\" (UID: \"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d\") " pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.632138 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.632121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fmw6\" (UniqueName: \"kubernetes.io/projected/df61a406-9ba7-4b2d-94b8-03e6d97a8118-kube-api-access-2fmw6\") pod \"ovnkube-node-7tbgn\" (UID: \"df61a406-9ba7-4b2d-94b8-03e6d97a8118\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.674993 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.674896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-systemd\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.674993 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.674931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-run\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.674993 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:35:36.674963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x765\" (UniqueName: \"kubernetes.io/projected/7b048749-149f-4a73-9f47-3ff1c3622ead-kube-api-access-8x765\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.674993 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.674989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/657382f2-3c88-4d85-b5cf-5533d6e4b19e-serviceca\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-kubernetes\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-run\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675006 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-systemd\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:35:36.675062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-kubernetes\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-sys\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25gln\" (UniqueName: \"kubernetes.io/projected/657382f2-3c88-4d85-b5cf-5533d6e4b19e-kube-api-access-25gln\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-host\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:36.675247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-sys\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675255 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-modprobe-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-conf\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtzh9\" (UniqueName: \"kubernetes.io/projected/205b70c4-2794-4a50-8e82-7285027e2f8d-kube-api-access-jtzh9\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:36.675313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-host\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-modprobe-d\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-sys-fs\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-var-lib-kubelet\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-sys-fs\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:36.675383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/657382f2-3c88-4d85-b5cf-5533d6e4b19e-serviceca\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-tuned\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysctl-conf\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-var-lib-kubelet\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.675464 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:35:36.675463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-socket-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675465 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysconfig\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-konnectivity-ca\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-sysconfig\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.676415 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-socket-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/657382f2-3c88-4d85-b5cf-5533d6e4b19e-host\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/657382f2-3c88-4d85-b5cf-5533d6e4b19e-host\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-agent-certs\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-device-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:35:36.675695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-tmp\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-lib-modules\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-registration-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-device-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-etc-selinux\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-etc-selinux\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/205b70c4-2794-4a50-8e82-7285027e2f8d-lib-modules\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.676415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.675907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b048749-149f-4a73-9f47-3ff1c3622ead-registration-dir\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.677256 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.676051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-konnectivity-ca\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.677612 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.677593 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-etc-tuned\") pod \"tuned-bgm7g\" (UID: 
\"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.677816 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.677795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7482d21-f4ae-4fe1-a26f-d1de8cd73926-agent-certs\") pod \"konnectivity-agent-crzt4\" (UID: \"a7482d21-f4ae-4fe1-a26f-d1de8cd73926\") " pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.677858 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.677818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/205b70c4-2794-4a50-8e82-7285027e2f8d-tmp\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.678188 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.678173 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmn96\" (UniqueName: \"kubernetes.io/projected/b151d377-fb3e-44d5-a5e1-57bb572347d7-kube-api-access-hmn96\") pod \"multus-additional-cni-plugins-mz26j\" (UID: \"b151d377-fb3e-44d5-a5e1-57bb572347d7\") " pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.688959 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.688915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x765\" (UniqueName: \"kubernetes.io/projected/7b048749-149f-4a73-9f47-3ff1c3622ead-kube-api-access-8x765\") pod \"aws-ebs-csi-driver-node-pt24f\" (UID: \"7b048749-149f-4a73-9f47-3ff1c3622ead\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.689150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.689132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtzh9\" (UniqueName: 
\"kubernetes.io/projected/205b70c4-2794-4a50-8e82-7285027e2f8d-kube-api-access-jtzh9\") pod \"tuned-bgm7g\" (UID: \"205b70c4-2794-4a50-8e82-7285027e2f8d\") " pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.690447 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.690432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gln\" (UniqueName: \"kubernetes.io/projected/657382f2-3c88-4d85-b5cf-5533d6e4b19e-kube-api-access-25gln\") pod \"node-ca-fj5g2\" (UID: \"657382f2-3c88-4d85-b5cf-5533d6e4b19e\") " pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.786304 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.786278 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2xzlk" Apr 21 15:35:36.792409 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.792390 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" Apr 21 15:35:36.792718 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.792693 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda396e4e6_5a05_450a_8a8c_263dd9674c34.slice/crio-910689c0142f09ed3d11a85997521ac925e4caa7030598469a07a1f06cbc6990 WatchSource:0}: Error finding container 910689c0142f09ed3d11a85997521ac925e4caa7030598469a07a1f06cbc6990: Status 404 returned error can't find the container with id 910689c0142f09ed3d11a85997521ac925e4caa7030598469a07a1f06cbc6990 Apr 21 15:35:36.798898 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.798877 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b048749_149f_4a73_9f47_3ff1c3622ead.slice/crio-72061e7f4c9242f68e3d26aab9c45f493222a695abd5fb09f6a4afcafbcf7977 WatchSource:0}: Error finding container 
72061e7f4c9242f68e3d26aab9c45f493222a695abd5fb09f6a4afcafbcf7977: Status 404 returned error can't find the container with id 72061e7f4c9242f68e3d26aab9c45f493222a695abd5fb09f6a4afcafbcf7977 Apr 21 15:35:36.811890 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.811873 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tcxzq" Apr 21 15:35:36.817706 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.817683 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49b893d4_ef45_4e0b_9df7_4cda6b46fd3d.slice/crio-2e0dc8af63f0f1318da34ac2c5be515cff7f8dd976fad57e76f4d351d384e62c WatchSource:0}: Error finding container 2e0dc8af63f0f1318da34ac2c5be515cff7f8dd976fad57e76f4d351d384e62c: Status 404 returned error can't find the container with id 2e0dc8af63f0f1318da34ac2c5be515cff7f8dd976fad57e76f4d351d384e62c Apr 21 15:35:36.823631 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.823615 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4v6vb" Apr 21 15:35:36.828874 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.828850 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:35:36.831382 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.831169 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5ace7b_a572_41f7_a65f_d0a88596a32b.slice/crio-01fc66dfb4be1d8e2be72703dc49590cf5b550af8e70f9c96e76019ed9bb08c9 WatchSource:0}: Error finding container 01fc66dfb4be1d8e2be72703dc49590cf5b550af8e70f9c96e76019ed9bb08c9: Status 404 returned error can't find the container with id 01fc66dfb4be1d8e2be72703dc49590cf5b550af8e70f9c96e76019ed9bb08c9 Apr 21 15:35:36.835502 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.835484 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf61a406_9ba7_4b2d_94b8_03e6d97a8118.slice/crio-c95cc726362106195fc64adc16548d7d09b3a15278b8a98a4e9e9f03376c206a WatchSource:0}: Error finding container c95cc726362106195fc64adc16548d7d09b3a15278b8a98a4e9e9f03376c206a: Status 404 returned error can't find the container with id c95cc726362106195fc64adc16548d7d09b3a15278b8a98a4e9e9f03376c206a Apr 21 15:35:36.859453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.859432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mz26j" Apr 21 15:35:36.865019 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.865001 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:36.865648 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.865627 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb151d377_fb3e_44d5_a5e1_57bb572347d7.slice/crio-b58f5167cd11db02b8bc5b6e88bfe298020ba827f12188f2a4d96af5dece61a5 WatchSource:0}: Error finding container b58f5167cd11db02b8bc5b6e88bfe298020ba827f12188f2a4d96af5dece61a5: Status 404 returned error can't find the container with id b58f5167cd11db02b8bc5b6e88bfe298020ba827f12188f2a4d96af5dece61a5 Apr 21 15:35:36.870704 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.870683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fj5g2" Apr 21 15:35:36.870906 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.870889 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7482d21_f4ae_4fe1_a26f_d1de8cd73926.slice/crio-e2e1e1455f230c3905fdfa14d7b4800edb385b48db3444e221457c9dcc3aeda0 WatchSource:0}: Error finding container e2e1e1455f230c3905fdfa14d7b4800edb385b48db3444e221457c9dcc3aeda0: Status 404 returned error can't find the container with id e2e1e1455f230c3905fdfa14d7b4800edb385b48db3444e221457c9dcc3aeda0 Apr 21 15:35:36.872093 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.872074 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" Apr 21 15:35:36.878055 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.878028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657382f2_3c88_4d85_b5cf_5533d6e4b19e.slice/crio-09c3a32d8ecec0d7639ec650e70d726493c803d52355bd9a613cd94cee223c7c WatchSource:0}: Error finding container 09c3a32d8ecec0d7639ec650e70d726493c803d52355bd9a613cd94cee223c7c: Status 404 returned error can't find the container with id 09c3a32d8ecec0d7639ec650e70d726493c803d52355bd9a613cd94cee223c7c Apr 21 15:35:36.878568 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:35:36.878547 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod205b70c4_2794_4a50_8e82_7285027e2f8d.slice/crio-304788cc94b7b108b8c366d952104249a1b96aab20de25fe41feefa23211a8ed WatchSource:0}: Error finding container 304788cc94b7b108b8c366d952104249a1b96aab20de25fe41feefa23211a8ed: Status 404 returned error can't find the container with id 304788cc94b7b108b8c366d952104249a1b96aab20de25fe41feefa23211a8ed Apr 21 15:35:36.879401 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:36.879382 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:37.078953 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.078841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:37.079113 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.079034 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:37.079113 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.079095 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.079075089 +0000 UTC m=+3.107241866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:37.179258 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.179224 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:37.179462 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.179415 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:37.179462 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.179440 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:37.179462 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.179453 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:37.179639 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:37.179511 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.179491332 +0000 UTC m=+3.207658114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:37.282529 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.282333 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:37.526919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.526806 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:36 +0000 UTC" deadline="2027-09-26 12:53:59.34901763 +0000 UTC" Apr 21 15:35:37.526919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.526841 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12549h18m21.822180383s" Apr 21 15:35:37.621962 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.618801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" event={"ID":"205b70c4-2794-4a50-8e82-7285027e2f8d","Type":"ContainerStarted","Data":"304788cc94b7b108b8c366d952104249a1b96aab20de25fe41feefa23211a8ed"} Apr 21 15:35:37.624674 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.624607 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fj5g2" event={"ID":"657382f2-3c88-4d85-b5cf-5533d6e4b19e","Type":"ContainerStarted","Data":"09c3a32d8ecec0d7639ec650e70d726493c803d52355bd9a613cd94cee223c7c"} Apr 21 15:35:37.636373 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.636342 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerStarted","Data":"b58f5167cd11db02b8bc5b6e88bfe298020ba827f12188f2a4d96af5dece61a5"} Apr 21 15:35:37.643484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.643409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4v6vb" event={"ID":"6e5ace7b-a572-41f7-a65f-d0a88596a32b","Type":"ContainerStarted","Data":"01fc66dfb4be1d8e2be72703dc49590cf5b550af8e70f9c96e76019ed9bb08c9"} Apr 21 15:35:37.657476 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.657413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tcxzq" event={"ID":"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d","Type":"ContainerStarted","Data":"2e0dc8af63f0f1318da34ac2c5be515cff7f8dd976fad57e76f4d351d384e62c"} Apr 21 15:35:37.660256 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.660192 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" event={"ID":"7b048749-149f-4a73-9f47-3ff1c3622ead","Type":"ContainerStarted","Data":"72061e7f4c9242f68e3d26aab9c45f493222a695abd5fb09f6a4afcafbcf7977"} Apr 21 15:35:37.669138 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.668972 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-crzt4" 
event={"ID":"a7482d21-f4ae-4fe1-a26f-d1de8cd73926","Type":"ContainerStarted","Data":"e2e1e1455f230c3905fdfa14d7b4800edb385b48db3444e221457c9dcc3aeda0"} Apr 21 15:35:37.681712 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.681685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"c95cc726362106195fc64adc16548d7d09b3a15278b8a98a4e9e9f03376c206a"} Apr 21 15:35:37.691903 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.691865 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2xzlk" event={"ID":"a396e4e6-5a05-450a-8a8c-263dd9674c34","Type":"ContainerStarted","Data":"910689c0142f09ed3d11a85997521ac925e4caa7030598469a07a1f06cbc6990"} Apr 21 15:35:37.701174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:37.701151 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:38.086781 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.086742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:38.086993 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.086888 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:38.086993 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.086972 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:40.086935843 +0000 UTC m=+5.115102630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:38.187513 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.187475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:38.187708 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.187635 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:38.187708 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.187654 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:38.187708 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.187667 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:38.187880 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.187730 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb 
podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:40.187709809 +0000 UTC m=+5.215876586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:38.527210 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.527111 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:36 +0000 UTC" deadline="2027-12-02 08:18:51.163369866 +0000 UTC" Apr 21 15:35:38.527210 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.527151 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14152h43m12.63622177s" Apr 21 15:35:38.604218 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.604184 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:38.604394 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.604333 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:38.604394 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:38.604178 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:38.604513 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:38.604440 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:40.105250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:40.105210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:40.105723 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.105397 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:40.105723 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.105517 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:44.105493428 +0000 UTC m=+9.133660204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:40.206376 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:40.206279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:40.206530 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.206472 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:40.206530 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.206497 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:40.206530 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.206511 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:40.206644 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.206573 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:44.206553103 +0000 UTC m=+9.234719880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:40.605309 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:40.604576 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:40.605309 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.604716 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:40.605309 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:40.605110 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:40.605309 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:40.605193 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:42.603969 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:42.603849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:42.604451 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:42.603984 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:42.604451 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:42.604370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:42.604560 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:42.604485 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:44.137738 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:44.137655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:44.138201 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.137798 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:44.138201 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.137859 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.1378406 +0000 UTC m=+17.166007387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:44.238375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:44.238336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:44.238560 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.238518 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:44.238560 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.238537 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:44.238560 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.238547 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:44.238720 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.238601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:52.238582851 +0000 UTC m=+17.266749644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:44.604113 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:44.604081 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:44.604331 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.604211 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:44.604331 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:44.604323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:44.604499 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:44.604395 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:46.604548 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:46.604518 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:46.604929 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:46.604652 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:46.604929 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:46.604685 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:46.604929 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:46.604778 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:48.603860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:48.603829 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:48.604314 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:48.603824 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:48.604314 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:48.603962 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:48.604314 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:48.604076 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:50.604710 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:50.604680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:50.605122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:50.604680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:50.605122 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:50.604823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:50.605122 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:50.604847 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:52.192108 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:52.192077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:52.192628 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.192259 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:52.192628 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.192340 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:08.192318936 +0000 UTC m=+33.220485713 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:52.292639 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:52.292601 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:52.292826 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.292758 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:52.292826 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.292777 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:52.292826 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.292786 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:52.292935 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.292838 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:08.292820992 +0000 UTC m=+33.320987785 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:52.604150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:52.604109 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:52.604447 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:52.604113 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:52.604447 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.604240 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:52.604447 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:52.604285 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:54.604402 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:54.604369 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:54.604777 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:54.604369 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:54.604777 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:54.604555 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:54.604777 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:54.604468 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:55.725244 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.724955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tcxzq" event={"ID":"49b893d4-ef45-4e0b-9df7-4cda6b46fd3d","Type":"ContainerStarted","Data":"f738a4314b3605725f9dec218bb845be04e235bda6ad02ad9968f378129e02bb"} Apr 21 15:35:55.735962 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.735892 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:35:55.736728 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736697 2579 generic.go:358] "Generic (PLEG): container finished" podID="df61a406-9ba7-4b2d-94b8-03e6d97a8118" containerID="59de133c60133969bc406944ad724f7fb4aef48e3d7615d7304a03666cc0e0ae" exitCode=1 Apr 21 15:35:55.736834 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736743 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"c786024ce556e4b5508c239960f7e1cceea8e1c4c73fe151fffb5e51280ace77"} Apr 21 15:35:55.736834 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"f08d9a8e0d728d9fd62af0ecb0784ce170f1a54fdeb0ee670aae8b543a81831c"} Apr 21 15:35:55.736834 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"e734726b57089d14ccc250ff648a45ee7dd1e90f9e1be0e4d5cf89a56203f137"} Apr 21 15:35:55.736834 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"09176da7c2940562e47e3adf176e5aac7173320f8439c8661306ba071b1ddf10"} Apr 21 15:35:55.736834 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736811 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerDied","Data":"59de133c60133969bc406944ad724f7fb4aef48e3d7615d7304a03666cc0e0ae"} Apr 21 15:35:55.736834 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.736822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"1f333cc4847caf13e66a98424c68a19b6aa32a75988ec56b24a45afc7a8d3576"} Apr 21 15:35:55.738594 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.738569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal" event={"ID":"546d5bb4e1c933cd7732e2f27360ece8","Type":"ContainerStarted","Data":"88c34504c7c49d6a004abed1b6773059a64479fbd7ff416086647988258e0f99"} Apr 21 15:35:55.740319 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.740281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" event={"ID":"205b70c4-2794-4a50-8e82-7285027e2f8d","Type":"ContainerStarted","Data":"c3bc0da42160a8ea68dbeadb0a4f0baef12fad3c7d74827e7d989d367d3b2056"} Apr 21 15:35:55.744658 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.744588 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tcxzq" podStartSLOduration=2.534163863 podStartE2EDuration="20.744562014s" podCreationTimestamp="2026-04-21 15:35:35 +0000 
UTC" firstStartedPulling="2026-04-21 15:35:36.819187822 +0000 UTC m=+1.847354610" lastFinishedPulling="2026-04-21 15:35:55.029585977 +0000 UTC m=+20.057752761" observedRunningTime="2026-04-21 15:35:55.743820801 +0000 UTC m=+20.771987596" watchObservedRunningTime="2026-04-21 15:35:55.744562014 +0000 UTC m=+20.772728804" Apr 21 15:35:55.762333 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.762294 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bgm7g" podStartSLOduration=1.611161557 podStartE2EDuration="19.76228021s" podCreationTimestamp="2026-04-21 15:35:36 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.880062486 +0000 UTC m=+1.908229260" lastFinishedPulling="2026-04-21 15:35:55.031181127 +0000 UTC m=+20.059347913" observedRunningTime="2026-04-21 15:35:55.762017502 +0000 UTC m=+20.790184297" watchObservedRunningTime="2026-04-21 15:35:55.76228021 +0000 UTC m=+20.790447006" Apr 21 15:35:55.782608 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:55.782561 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-141.ec2.internal" podStartSLOduration=19.782546139 podStartE2EDuration="19.782546139s" podCreationTimestamp="2026-04-21 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:55.782005846 +0000 UTC m=+20.810172641" watchObservedRunningTime="2026-04-21 15:35:55.782546139 +0000 UTC m=+20.810712914" Apr 21 15:35:56.604581 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.604554 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:56.604695 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.604554 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:56.604695 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:56.604651 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:56.604762 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:56.604719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:56.745602 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.745534 2579 generic.go:358] "Generic (PLEG): container finished" podID="ceaff6e191396af72f395bd243596937" containerID="78c3bcb3248169a984899066c52664da9f10aed1013ea64375344a4e3f01f41d" exitCode=0 Apr 21 15:35:56.745974 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.745794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal" event={"ID":"ceaff6e191396af72f395bd243596937","Type":"ContainerDied","Data":"78c3bcb3248169a984899066c52664da9f10aed1013ea64375344a4e3f01f41d"} Apr 21 15:35:56.747310 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.747283 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fj5g2" 
event={"ID":"657382f2-3c88-4d85-b5cf-5533d6e4b19e","Type":"ContainerStarted","Data":"6c0318bdf800620025ebc6dbe1f4968c2b104d2e821b671b3862cd67d7a5250f"} Apr 21 15:35:56.749299 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.749273 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="64fa212e4fe7a5f372314c7e539f9850389ff3932045c3968af4b58b3845f4c7" exitCode=0 Apr 21 15:35:56.749410 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.749338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"64fa212e4fe7a5f372314c7e539f9850389ff3932045c3968af4b58b3845f4c7"} Apr 21 15:35:56.751568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.751536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4v6vb" event={"ID":"6e5ace7b-a572-41f7-a65f-d0a88596a32b","Type":"ContainerStarted","Data":"eb4b9119ac0617ba11d512ab51a14722e002262bce9659e6e2066a891f89e799"} Apr 21 15:35:56.752838 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.752815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" event={"ID":"7b048749-149f-4a73-9f47-3ff1c3622ead","Type":"ContainerStarted","Data":"185aa325e9d91dce55eecc32958d57efe6f3aa2b8a19ea6394c6930086ae1301"} Apr 21 15:35:56.754809 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.754442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-crzt4" event={"ID":"a7482d21-f4ae-4fe1-a26f-d1de8cd73926","Type":"ContainerStarted","Data":"7c89df204739b087c0c9f89b4fde53b86e79b99285485f1bf2606008ae41d6a4"} Apr 21 15:35:56.756141 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.756119 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2xzlk" 
event={"ID":"a396e4e6-5a05-450a-8a8c-263dd9674c34","Type":"ContainerStarted","Data":"20fd1a50dd334522c3861b5f1b31bbae333050100658aab3811df2d36728accd"} Apr 21 15:35:56.814340 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.814286 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4v6vb" podStartSLOduration=3.616212698 podStartE2EDuration="21.814270121s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.83273726 +0000 UTC m=+1.860904034" lastFinishedPulling="2026-04-21 15:35:55.03079467 +0000 UTC m=+20.058961457" observedRunningTime="2026-04-21 15:35:56.788010355 +0000 UTC m=+21.816177152" watchObservedRunningTime="2026-04-21 15:35:56.814270121 +0000 UTC m=+21.842436917" Apr 21 15:35:56.858465 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.858419 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-crzt4" podStartSLOduration=3.741751266 podStartE2EDuration="21.858401263s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.873634465 +0000 UTC m=+1.901801253" lastFinishedPulling="2026-04-21 15:35:54.990284464 +0000 UTC m=+20.018451250" observedRunningTime="2026-04-21 15:35:56.84063201 +0000 UTC m=+21.868798806" watchObservedRunningTime="2026-04-21 15:35:56.858401263 +0000 UTC m=+21.886568060" Apr 21 15:35:56.883145 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.883097 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2xzlk" podStartSLOduration=3.688469902 podStartE2EDuration="21.883082617s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.795643054 +0000 UTC m=+1.823809828" lastFinishedPulling="2026-04-21 15:35:54.990255764 +0000 UTC m=+20.018422543" observedRunningTime="2026-04-21 15:35:56.882998762 +0000 UTC m=+21.911165557" 
watchObservedRunningTime="2026-04-21 15:35:56.883082617 +0000 UTC m=+21.911249415" Apr 21 15:35:56.883657 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:56.883617 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fj5g2" podStartSLOduration=2.7734296130000002 podStartE2EDuration="20.88360213s" podCreationTimestamp="2026-04-21 15:35:36 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.880068022 +0000 UTC m=+1.908234796" lastFinishedPulling="2026-04-21 15:35:54.990240531 +0000 UTC m=+20.018407313" observedRunningTime="2026-04-21 15:35:56.858534502 +0000 UTC m=+21.886701292" watchObservedRunningTime="2026-04-21 15:35:56.88360213 +0000 UTC m=+21.911768927" Apr 21 15:35:57.343497 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.343461 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:57.537763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.537592 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:57.343487366Z","UUID":"b73a1850-7c7b-483c-b376-b3d4d69a7499","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:57.540874 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.540852 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:57.541012 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.540886 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:57.760271 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.760193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal" event={"ID":"ceaff6e191396af72f395bd243596937","Type":"ContainerStarted","Data":"47fe3e09548eedaa007c311791422a274df74a0eaf282994b6a63138561669a3"} Apr 21 15:35:57.761982 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.761933 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" event={"ID":"7b048749-149f-4a73-9f47-3ff1c3622ead","Type":"ContainerStarted","Data":"208fccde9512bb106cf1092c4ac7cd2e35e61b3796ccf2695e320efe0f0e6577"} Apr 21 15:35:57.783844 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:57.783664 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-141.ec2.internal" podStartSLOduration=21.783652619 podStartE2EDuration="21.783652619s" podCreationTimestamp="2026-04-21 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:57.783650847 +0000 UTC m=+22.811817644" watchObservedRunningTime="2026-04-21 15:35:57.783652619 +0000 UTC m=+22.811819414" Apr 21 15:35:58.604044 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.604010 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:35:58.604244 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.604010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:35:58.604244 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:58.604139 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:35:58.604244 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:35:58.604204 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:35:58.766821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.766781 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:35:58.767289 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.767217 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"4870222b2978a14b6b282db0363dc0cb3ef0beed2e4acdf510df5d737a982a11"} Apr 21 15:35:58.769358 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.769326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" 
event={"ID":"7b048749-149f-4a73-9f47-3ff1c3622ead","Type":"ContainerStarted","Data":"21a97eec0d5eeb7f3f5a8d9c12f452a68635545ae991a907f3b57677cd63f1bc"} Apr 21 15:35:58.790401 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.790354 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pt24f" podStartSLOduration=2.453025248 podStartE2EDuration="23.790337463s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.800502255 +0000 UTC m=+1.828669028" lastFinishedPulling="2026-04-21 15:35:58.137814453 +0000 UTC m=+23.165981243" observedRunningTime="2026-04-21 15:35:58.78966793 +0000 UTC m=+23.817834728" watchObservedRunningTime="2026-04-21 15:35:58.790337463 +0000 UTC m=+23.818504260" Apr 21 15:35:58.839900 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.839863 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:58.840646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.840624 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:58.993051 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.992971 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:35:58.993545 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:35:58.993504 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-crzt4" Apr 21 15:36:00.604621 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:00.604594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:00.604621 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:00.604616 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:00.605351 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:00.604703 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:36:00.605351 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:00.604849 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:36:01.776623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.776379 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="3b89059f92ba62771d3830ec129041d8444ea48fbf6573257d431e26ff4c773b" exitCode=0 Apr 21 15:36:01.777192 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.776460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"3b89059f92ba62771d3830ec129041d8444ea48fbf6573257d431e26ff4c773b"} Apr 21 15:36:01.779772 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.779753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:36:01.780097 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.780074 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"a292a51c8157041c9fc7084b9eeae643f6e6b91c8e8c65ba6dc7898017d83fdd"} Apr 21 15:36:01.780379 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.780359 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:36:01.780476 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.780387 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:36:01.780476 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.780397 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:36:01.780563 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.780547 2579 scope.go:117] "RemoveContainer" containerID="59de133c60133969bc406944ad724f7fb4aef48e3d7615d7304a03666cc0e0ae" Apr 21 15:36:01.795796 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.795772 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:36:01.796324 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:01.796306 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" Apr 21 15:36:02.604884 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:02.604852 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:02.605257 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:02.604852 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:02.605257 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:02.604988 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:36:02.605257 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:02.605033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:36:02.786889 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:02.786864 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:36:02.787862 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:02.787827 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" event={"ID":"df61a406-9ba7-4b2d-94b8-03e6d97a8118","Type":"ContainerStarted","Data":"a725933766875aec2bd84d27c739eb498a3f21d6415e0ed0c3c1e3b0416dde40"} Apr 21 15:36:02.846168 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:02.845894 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn" podStartSLOduration=9.605842894 podStartE2EDuration="27.845878993s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 
15:35:36.837036608 +0000 UTC m=+1.865203385" lastFinishedPulling="2026-04-21 15:35:55.077072711 +0000 UTC m=+20.105239484" observedRunningTime="2026-04-21 15:36:02.843243507 +0000 UTC m=+27.871410303" watchObservedRunningTime="2026-04-21 15:36:02.845878993 +0000 UTC m=+27.874045788" Apr 21 15:36:03.066854 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.066690 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x5zkt"] Apr 21 15:36:03.067034 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.066977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:03.067118 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:03.067091 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:36:03.068711 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.068679 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zmzmd"] Apr 21 15:36:03.068827 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.068781 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:03.068894 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:03.068867 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:36:03.791322 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.791292 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="750860cac28563e1127082e6190682d2e633d9940c82a916736d9e46fdf569ba" exitCode=0 Apr 21 15:36:03.791855 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:03.791371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"750860cac28563e1127082e6190682d2e633d9940c82a916736d9e46fdf569ba"} Apr 21 15:36:04.604812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:04.604776 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:04.604997 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:04.604888 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:36:04.604997 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:04.604969 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:04.605120 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:04.605083 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:36:05.796754 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:05.796720 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="607990631cbca3da35a282d0933ec0ca7441f80889618c64fcf8c2182651c235" exitCode=0 Apr 21 15:36:05.797306 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:05.796783 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"607990631cbca3da35a282d0933ec0ca7441f80889618c64fcf8c2182651c235"} Apr 21 15:36:06.604075 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:06.604040 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:06.604075 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:06.604068 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:06.604314 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:06.604154 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zmzmd" podUID="f9899d71-8c55-4ec8-929c-ab8f3dcf09e9" Apr 21 15:36:06.604356 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:06.604306 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5zkt" podUID="088bc7e8-4515-4c77-967b-a70ef32cd85e" Apr 21 15:36:08.209176 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.209084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt" Apr 21 15:36:08.209628 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.209266 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:36:08.209628 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.209327 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:40.209310122 +0000 UTC m=+65.237476895 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:36:08.309623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.309586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd" Apr 21 15:36:08.309806 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.309750 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:36:08.309806 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.309774 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:36:08.309806 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.309789 2579 projected.go:194] Error preparing data for projected volume kube-api-access-ldhbb for pod openshift-network-diagnostics/network-check-target-zmzmd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:36:08.309992 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.309856 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb podName:f9899d71-8c55-4ec8-929c-ab8f3dcf09e9 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:40.309840454 +0000 UTC m=+65.338007228 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldhbb" (UniqueName: "kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb") pod "network-check-target-zmzmd" (UID: "f9899d71-8c55-4ec8-929c-ab8f3dcf09e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:36:08.313426 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.313395 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-141.ec2.internal" event="NodeReady" Apr 21 15:36:08.313546 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.313530 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:36:08.378691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.378656 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dgpm6"] Apr 21 15:36:08.394955 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.394918 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-klb4g"] Apr 21 15:36:08.395962 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.395468 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dgpm6" Apr 21 15:36:08.400189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.398851 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:36:08.400189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.399409 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\"" Apr 21 15:36:08.400189 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.399408 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:36:08.417878 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.417794 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dgpm6"] Apr 21 15:36:08.417878 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.417885 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-klb4g"] Apr 21 15:36:08.418110 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.417981 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:36:08.421219 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.421192 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:36:08.421524 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.421503 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:36:08.423783 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.423761 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:36:08.424469 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.424380 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\"" Apr 21 15:36:08.511734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98l42\" (UniqueName: \"kubernetes.io/projected/dff5e891-a3c3-4526-94e0-f1c91d517e9d-kube-api-access-98l42\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:36:08.511734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6" Apr 21 15:36:08.511734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb69\" 
(UniqueName: \"kubernetes.io/projected/704d38f9-6323-48bf-b8f7-977c83275b82-kube-api-access-mnb69\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.512000 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:08.512000 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/704d38f9-6323-48bf-b8f7-977c83275b82-config-volume\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.512000 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.511877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/704d38f9-6323-48bf-b8f7-977c83275b82-tmp-dir\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.603978 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.603924 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:36:08.604171 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.603926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:36:08.612472 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98l42\" (UniqueName: \"kubernetes.io/projected/dff5e891-a3c3-4526-94e0-f1c91d517e9d-kube-api-access-98l42\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:08.612605 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.612605 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb69\" (UniqueName: \"kubernetes.io/projected/704d38f9-6323-48bf-b8f7-977c83275b82-kube-api-access-mnb69\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.612605 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:08.612605 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/704d38f9-6323-48bf-b8f7-977c83275b82-config-volume\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.612811 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.612616 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/704d38f9-6323-48bf-b8f7-977c83275b82-tmp-dir\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.613062 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.613041 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/704d38f9-6323-48bf-b8f7-977c83275b82-tmp-dir\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.613157 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.613141 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:08.613210 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.613203 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:09.113185152 +0000 UTC m=+34.141351926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:08.613430 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.613413 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:08.613490 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:08.613460 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:09.113446987 +0000 UTC m=+34.141613764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:08.613959 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.613919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/704d38f9-6323-48bf-b8f7-977c83275b82-config-volume\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:08.615817 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.615799 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:36:08.615922 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.615887 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\""
Apr 21 15:36:08.615922 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.615911 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:36:08.617161 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.617145 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:36:08.620429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.620412 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\""
Apr 21 15:36:08.629229 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.629203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98l42\" (UniqueName: \"kubernetes.io/projected/dff5e891-a3c3-4526-94e0-f1c91d517e9d-kube-api-access-98l42\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:08.629516 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:08.629495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb69\" (UniqueName: \"kubernetes.io/projected/704d38f9-6323-48bf-b8f7-977c83275b82-kube-api-access-mnb69\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:09.115641 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:09.115604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:09.115641 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:09.115641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:09.115890 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:09.115743 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:09.115890 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:09.115772 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:09.115890 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:09.115802 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.115783181 +0000 UTC m=+35.143949957 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:09.115890 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:09.115835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.115815047 +0000 UTC m=+35.143981835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:10.123201 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:10.123167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:10.123201 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:10.123201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:10.123681 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:10.123314 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:10.123681 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:10.123320 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:10.123681 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:10.123371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:12.12335368 +0000 UTC m=+37.151520459 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:10.123681 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:10.123385 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:12.123379075 +0000 UTC m=+37.151545849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:11.810029 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:11.809994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerStarted","Data":"0156e9cfb6cd74f6b412bfd231fff6b4f2cc64d753aa5e88071a9bccb8856d6f"}
Apr 21 15:36:12.137827 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:12.137794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:12.137827 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:12.137827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:12.138008 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:12.137956 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:12.138008 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:12.137969 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:12.138008 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:12.138005 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.137991381 +0000 UTC m=+41.166158154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:12.138117 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:12.138034 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.138018168 +0000 UTC m=+41.166184942 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:12.814445 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:12.814359 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="0156e9cfb6cd74f6b412bfd231fff6b4f2cc64d753aa5e88071a9bccb8856d6f" exitCode=0
Apr 21 15:36:12.814445 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:12.814416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"0156e9cfb6cd74f6b412bfd231fff6b4f2cc64d753aa5e88071a9bccb8856d6f"}
Apr 21 15:36:13.818790 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:13.818754 2579 generic.go:358] "Generic (PLEG): container finished" podID="b151d377-fb3e-44d5-a5e1-57bb572347d7" containerID="e238e0fc06c11b16feb74b83c6fd70b1e40b7f2c28f31953f4a6c803bd04cf70" exitCode=0
Apr 21 15:36:13.819249 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:13.818810 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerDied","Data":"e238e0fc06c11b16feb74b83c6fd70b1e40b7f2c28f31953f4a6c803bd04cf70"}
Apr 21 15:36:14.824112 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:14.824078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz26j" event={"ID":"b151d377-fb3e-44d5-a5e1-57bb572347d7","Type":"ContainerStarted","Data":"a5f83fc974ba545d93ba597fe96f1427082837c422683b24b25a20e7327f8ab8"}
Apr 21 15:36:14.848268 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:14.848215 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mz26j" podStartSLOduration=5.077792932 podStartE2EDuration="39.848202023s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:35:36.867907831 +0000 UTC m=+1.896074606" lastFinishedPulling="2026-04-21 15:36:11.638316921 +0000 UTC m=+36.666483697" observedRunningTime="2026-04-21 15:36:14.846821803 +0000 UTC m=+39.874988599" watchObservedRunningTime="2026-04-21 15:36:14.848202023 +0000 UTC m=+39.876368855"
Apr 21 15:36:16.163669 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:16.163631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:16.163669 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:16.163671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:16.164285 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:16.163761 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:16.164285 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:16.163768 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:16.164285 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:16.163821 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:24.163803776 +0000 UTC m=+49.191970549 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:16.164285 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:16.163835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:24.163828856 +0000 UTC m=+49.191995630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:24.222334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:24.222290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:24.222334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:24.222337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:24.222886 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:24.222429 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:24.222886 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:24.222441 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:24.222886 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:24.222490 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:40.222475402 +0000 UTC m=+65.250642177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:24.222886 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:24.222575 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:40.222543318 +0000 UTC m=+65.250710095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:33.810080 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:33.810043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tbgn"
Apr 21 15:36:40.231623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.231587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.231640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.231699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.231734 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.231775 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.231808 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:37:12.231780732 +0000 UTC m=+97.259947509 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:36:40.232144 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.231827 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:12.231818382 +0000 UTC m=+97.259985156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:40.234802 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.234783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:36:40.241910 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.241891 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:36:40.242001 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:36:40.241936 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs podName:088bc7e8-4515-4c77-967b-a70ef32cd85e nodeName:}" failed. No retries permitted until 2026-04-21 15:37:44.24192281 +0000 UTC m=+129.270089585 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs") pod "network-metrics-daemon-x5zkt" (UID: "088bc7e8-4515-4c77-967b-a70ef32cd85e") : secret "metrics-daemon-secret" not found
Apr 21 15:36:40.332725 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.332696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:36:40.335889 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.335871 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:36:40.345509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.345492 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:36:40.357004 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.356984 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldhbb\" (UniqueName: \"kubernetes.io/projected/f9899d71-8c55-4ec8-929c-ab8f3dcf09e9-kube-api-access-ldhbb\") pod \"network-check-target-zmzmd\" (UID: \"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9\") " pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:36:40.425619 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.425594 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\""
Apr 21 15:36:40.433046 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.433027 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:36:40.618855 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.618828 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zmzmd"]
Apr 21 15:36:40.622181 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:36:40.622149 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9899d71_8c55_4ec8_929c_ab8f3dcf09e9.slice/crio-a42284219ccc0b615831f5b14a0d5bf91d5d83e8f83c5245285af6cd32a40980 WatchSource:0}: Error finding container a42284219ccc0b615831f5b14a0d5bf91d5d83e8f83c5245285af6cd32a40980: Status 404 returned error can't find the container with id a42284219ccc0b615831f5b14a0d5bf91d5d83e8f83c5245285af6cd32a40980
Apr 21 15:36:40.876369 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:40.876344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zmzmd" event={"ID":"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9","Type":"ContainerStarted","Data":"a42284219ccc0b615831f5b14a0d5bf91d5d83e8f83c5245285af6cd32a40980"}
Apr 21 15:36:43.883040 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:43.883003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zmzmd" event={"ID":"f9899d71-8c55-4ec8-929c-ab8f3dcf09e9","Type":"ContainerStarted","Data":"1906339c3358bc30f76ddfee966717a292b18ab1fea83cda952be7fd3f09fb0d"}
Apr 21 15:36:43.883383 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:43.883188 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:36:43.906121 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:36:43.906073 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zmzmd" podStartSLOduration=66.205350587 podStartE2EDuration="1m8.906061103s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:36:40.623963991 +0000 UTC m=+65.652130765" lastFinishedPulling="2026-04-21 15:36:43.324674493 +0000 UTC m=+68.352841281" observedRunningTime="2026-04-21 15:36:43.904490784 +0000 UTC m=+68.932657579" watchObservedRunningTime="2026-04-21 15:36:43.906061103 +0000 UTC m=+68.934227899"
Apr 21 15:37:12.237714 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:12.237672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6"
Apr 21 15:37:12.237714 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:12.237718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g"
Apr 21 15:37:12.238376 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:12.237820 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:37:12.238376 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:12.237881 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls podName:704d38f9-6323-48bf-b8f7-977c83275b82 nodeName:}" failed. No retries permitted until 2026-04-21 15:38:16.237865296 +0000 UTC m=+161.266032071 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls") pod "dns-default-dgpm6" (UID: "704d38f9-6323-48bf-b8f7-977c83275b82") : secret "dns-default-metrics-tls" not found
Apr 21 15:37:12.238376 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:12.237825 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:37:12.238376 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:12.238013 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert podName:dff5e891-a3c3-4526-94e0-f1c91d517e9d nodeName:}" failed. No retries permitted until 2026-04-21 15:38:16.23799268 +0000 UTC m=+161.266159457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert") pod "ingress-canary-klb4g" (UID: "dff5e891-a3c3-4526-94e0-f1c91d517e9d") : secret "canary-serving-cert" not found
Apr 21 15:37:14.887462 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:14.887431 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zmzmd"
Apr 21 15:37:18.243771 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.243735 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"]
Apr 21 15:37:18.247802 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.247783 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:18.250359 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.250340 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 15:37:18.251489 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.251469 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 15:37:18.252213 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.252199 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 15:37:18.252272 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.252200 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6qqm6\""
Apr 21 15:37:18.252563 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.252539 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 15:37:18.255699 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.255617 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"]
Apr 21 15:37:18.273599 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.273576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a1f66133-b065-4100-b368-ac1f349bf896-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:18.273719 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.273620 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mln4z\" (UniqueName: \"kubernetes.io/projected/a1f66133-b065-4100-b368-ac1f349bf896-kube-api-access-mln4z\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:18.273719 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.273707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:18.341806 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.341781 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r"]
Apr 21 15:37:18.344665 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.344639 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r"
Apr 21 15:37:18.348208 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.348189 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:37:18.348321 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.348292 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-h4b6w\""
Apr 21 15:37:18.348776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.348761 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 21 15:37:18.365016 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.364995 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r"]
Apr 21 15:37:18.374148 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.374125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:18.374229 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.374169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a1f66133-b065-4100-b368-ac1f349bf896-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21
15:37:18.374229 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.374190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mln4z\" (UniqueName: \"kubernetes.io/projected/a1f66133-b065-4100-b368-ac1f349bf896-kube-api-access-mln4z\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:18.374229 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.374217 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmz94\" (UniqueName: \"kubernetes.io/projected/0bb5aac3-8258-4d28-bb86-c01cb26966b1-kube-api-access-tmz94\") pod \"volume-data-source-validator-7c6cbb6c87-npp9r\" (UID: \"0bb5aac3-8258-4d28-bb86-c01cb26966b1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" Apr 21 15:37:18.374325 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.374259 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:18.374358 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.374327 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:18.874311362 +0000 UTC m=+103.902478136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:18.374751 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.374735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a1f66133-b065-4100-b368-ac1f349bf896-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:18.390589 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.390567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mln4z\" (UniqueName: \"kubernetes.io/projected/a1f66133-b065-4100-b368-ac1f349bf896-kube-api-access-mln4z\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:18.474944 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.474919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmz94\" (UniqueName: \"kubernetes.io/projected/0bb5aac3-8258-4d28-bb86-c01cb26966b1-kube-api-access-tmz94\") pod \"volume-data-source-validator-7c6cbb6c87-npp9r\" (UID: \"0bb5aac3-8258-4d28-bb86-c01cb26966b1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" Apr 21 15:37:18.486525 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.486508 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmz94\" (UniqueName: \"kubernetes.io/projected/0bb5aac3-8258-4d28-bb86-c01cb26966b1-kube-api-access-tmz94\") 
pod \"volume-data-source-validator-7c6cbb6c87-npp9r\" (UID: \"0bb5aac3-8258-4d28-bb86-c01cb26966b1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" Apr 21 15:37:18.614706 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.614679 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kwhwh"] Apr 21 15:37:18.617978 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.617963 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"] Apr 21 15:37:18.618124 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.618108 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.620683 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.620668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.621770 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.621754 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 15:37:18.621862 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.621834 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 15:37:18.621862 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.621839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:37:18.621985 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.621866 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bckpm\"" Apr 21 15:37:18.622118 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.622095 2579 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 15:37:18.623750 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.623721 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:37:18.623874 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.623765 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:37:18.625597 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.625539 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kwwxq\"" Apr 21 15:37:18.632035 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.632013 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kwhwh"] Apr 21 15:37:18.634764 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.634740 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:37:18.634862 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.634785 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:37:18.638455 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.638439 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 15:37:18.645609 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.645589 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"] Apr 21 15:37:18.653013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.652996 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" Apr 21 15:37:18.676453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676571 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676571 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vq67\" (UniqueName: \"kubernetes.io/projected/60c10748-f987-4f76-8f57-6a42bf9f4321-kube-api-access-2vq67\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.676686 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676686 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676651 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676784 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676784 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676727 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6945m\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.676888 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-config\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.676888 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676844 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-trusted-ca\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.677010 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.676901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.679387 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.677265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.679387 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.677312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c10748-f987-4f76-8f57-6a42bf9f4321-serving-cert\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.772773 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.772740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r"] Apr 21 15:37:18.776852 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:18.776826 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb5aac3_8258_4d28_bb86_c01cb26966b1.slice/crio-99757fe1b5297d7edce9b559ae8e91c7c1f4052ecfb658bd251a200ff8af2df2 WatchSource:0}: Error finding container 99757fe1b5297d7edce9b559ae8e91c7c1f4052ecfb658bd251a200ff8af2df2: Status 404 returned error can't find the container with id 99757fe1b5297d7edce9b559ae8e91c7c1f4052ecfb658bd251a200ff8af2df2 Apr 21 15:37:18.777656 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777626 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.777734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.777833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vq67\" (UniqueName: \"kubernetes.io/projected/60c10748-f987-4f76-8f57-6a42bf9f4321-kube-api-access-2vq67\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.777918 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778001 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778001 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.777985 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778098 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6945m\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778098 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-config\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 
15:37:18.778098 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-trusted-ca\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.778247 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778247 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778179 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778247 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c10748-f987-4f76-8f57-6a42bf9f4321-serving-cert\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.778594 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates\") 
pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.778775 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-config\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.778917 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.778896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.779237 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.779221 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:37:18.779290 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.779241 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6686459d59-wxj7x: secret "image-registry-tls" not found Apr 21 15:37:18.779330 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.779302 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls podName:5cbb7c2f-db4c-45f0-886d-922983c0ce02 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:19.279283537 +0000 UTC m=+104.307450325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls") pod "image-registry-6686459d59-wxj7x" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02") : secret "image-registry-tls" not found Apr 21 15:37:18.779370 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.779331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.780011 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.779707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60c10748-f987-4f76-8f57-6a42bf9f4321-trusted-ca\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.781142 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.781120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c10748-f987-4f76-8f57-6a42bf9f4321-serving-cert\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.781580 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.781556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.781879 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.781860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.794391 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.794368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6945m\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.796405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.796385 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:18.797722 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.797703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vq67\" (UniqueName: \"kubernetes.io/projected/60c10748-f987-4f76-8f57-6a42bf9f4321-kube-api-access-2vq67\") pod \"console-operator-9d4b6777b-kwhwh\" (UID: \"60c10748-f987-4f76-8f57-6a42bf9f4321\") " pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.879227 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.879164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:18.879317 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.879271 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:18.879379 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:18.879331 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:19.879313228 +0000 UTC m=+104.907480019 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:18.928498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.928464 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:18.949801 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:18.949770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" event={"ID":"0bb5aac3-8258-4d28-bb86-c01cb26966b1","Type":"ContainerStarted","Data":"99757fe1b5297d7edce9b559ae8e91c7c1f4052ecfb658bd251a200ff8af2df2"} Apr 21 15:37:19.045668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:19.045642 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kwhwh"] Apr 21 15:37:19.049384 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:19.049359 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c10748_f987_4f76_8f57_6a42bf9f4321.slice/crio-c78f9bb90345fc67037aff076b839943efe78daa702adb4a819b0b905d2ca553 WatchSource:0}: Error finding container c78f9bb90345fc67037aff076b839943efe78daa702adb4a819b0b905d2ca553: Status 404 returned error can't find the container with id c78f9bb90345fc67037aff076b839943efe78daa702adb4a819b0b905d2ca553 Apr 21 15:37:19.281641 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:19.281569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:19.282024 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:19.281694 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:37:19.282024 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:19.281710 2579 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-6686459d59-wxj7x: secret "image-registry-tls" not found Apr 21 15:37:19.282024 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:19.281762 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls podName:5cbb7c2f-db4c-45f0-886d-922983c0ce02 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:20.281747465 +0000 UTC m=+105.309914239 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls") pod "image-registry-6686459d59-wxj7x" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02") : secret "image-registry-tls" not found Apr 21 15:37:19.886773 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:19.886701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:19.886975 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:19.886836 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:19.886975 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:19.886926 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:21.886903634 +0000 UTC m=+106.915070413 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:19.953078 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:19.953028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" event={"ID":"60c10748-f987-4f76-8f57-6a42bf9f4321","Type":"ContainerStarted","Data":"c78f9bb90345fc67037aff076b839943efe78daa702adb4a819b0b905d2ca553"} Apr 21 15:37:20.290799 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:20.290758 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:20.291106 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:20.290900 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:37:20.291106 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:20.290918 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6686459d59-wxj7x: secret "image-registry-tls" not found Apr 21 15:37:20.291106 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:20.290989 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls podName:5cbb7c2f-db4c-45f0-886d-922983c0ce02 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:22.290970396 +0000 UTC m=+107.319137170 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls") pod "image-registry-6686459d59-wxj7x" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02") : secret "image-registry-tls" not found Apr 21 15:37:20.956125 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:20.956095 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" event={"ID":"0bb5aac3-8258-4d28-bb86-c01cb26966b1","Type":"ContainerStarted","Data":"41a946bb97db82fa4997871d12391352e1d24abaf3776e3dfe4c6a75de19a828"} Apr 21 15:37:20.984068 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:20.984025 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-npp9r" podStartSLOduration=1.58513058 podStartE2EDuration="2.984007922s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:18.778786787 +0000 UTC m=+103.806953563" lastFinishedPulling="2026-04-21 15:37:20.177664128 +0000 UTC m=+105.205830905" observedRunningTime="2026-04-21 15:37:20.983468843 +0000 UTC m=+106.011635638" watchObservedRunningTime="2026-04-21 15:37:20.984007922 +0000 UTC m=+106.012174720" Apr 21 15:37:21.904415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:21.904384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:21.904769 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:21.904520 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 21 15:37:21.904769 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:21.904581 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:25.904563266 +0000 UTC m=+110.932730039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:21.959036 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:21.959014 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/0.log" Apr 21 15:37:21.959151 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:21.959049 2579 generic.go:358] "Generic (PLEG): container finished" podID="60c10748-f987-4f76-8f57-6a42bf9f4321" containerID="828107d318e8586d25bbe8c47e0ecfbf3c01bdf5cfaae8ddcd10944af01223a1" exitCode=255 Apr 21 15:37:21.959151 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:21.959131 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" event={"ID":"60c10748-f987-4f76-8f57-6a42bf9f4321","Type":"ContainerDied","Data":"828107d318e8586d25bbe8c47e0ecfbf3c01bdf5cfaae8ddcd10944af01223a1"} Apr 21 15:37:21.959308 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:21.959293 2579 scope.go:117] "RemoveContainer" containerID="828107d318e8586d25bbe8c47e0ecfbf3c01bdf5cfaae8ddcd10944af01223a1" Apr 21 15:37:22.307687 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.307611 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:22.307809 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:22.307725 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:37:22.307809 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:22.307736 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6686459d59-wxj7x: secret "image-registry-tls" not found Apr 21 15:37:22.307809 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:22.307787 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls podName:5cbb7c2f-db4c-45f0-886d-922983c0ce02 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:26.307772112 +0000 UTC m=+111.335938885 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls") pod "image-registry-6686459d59-wxj7x" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02") : secret "image-registry-tls" not found Apr 21 15:37:22.962240 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962212 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/1.log" Apr 21 15:37:22.962698 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962582 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/0.log" Apr 21 15:37:22.962698 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962618 2579 generic.go:358] "Generic (PLEG): container finished" podID="60c10748-f987-4f76-8f57-6a42bf9f4321" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01" exitCode=255 Apr 21 15:37:22.962698 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962655 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" event={"ID":"60c10748-f987-4f76-8f57-6a42bf9f4321","Type":"ContainerDied","Data":"16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01"} Apr 21 15:37:22.962848 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962711 2579 scope.go:117] "RemoveContainer" containerID="828107d318e8586d25bbe8c47e0ecfbf3c01bdf5cfaae8ddcd10944af01223a1" Apr 21 15:37:22.962931 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:22.962915 2579 scope.go:117] "RemoveContainer" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01" Apr 21 15:37:22.963146 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:22.963126 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321" Apr 21 15:37:23.234041 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.233978 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb"] Apr 21 15:37:23.237861 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.237846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" Apr 21 15:37:23.241034 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.241007 2579 status_manager.go:895] "Failed to get status for pod" podUID="d9332cef-c45c-4842-b3bb-c9aa72e9fbf5" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" err="pods \"migrator-74bb7799d9-l82qb\" is forbidden: User \"system:node:ip-10-0-132-141.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-kube-storage-version-migrator\": no relationship found between node 'ip-10-0-132-141.ec2.internal' and this object" Apr 21 15:37:23.241135 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:23.241083 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"kube-storage-version-migrator-sa-dockercfg-7vtdz\" is forbidden: User \"system:node:ip-10-0-132-141.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-storage-version-migrator\": no relationship found between node 'ip-10-0-132-141.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7vtdz\"" type="*v1.Secret" Apr 21 15:37:23.241299 
ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:23.241273 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-132-141.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-storage-version-migrator\": no relationship found between node 'ip-10-0-132-141.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 21 15:37:23.241354 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:23.241311 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-132-141.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-storage-version-migrator\": no relationship found between node 'ip-10-0-132-141.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 21 15:37:23.248469 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.248446 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb"] Apr 21 15:37:23.316044 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.316018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67wc\" (UniqueName: \"kubernetes.io/projected/d9332cef-c45c-4842-b3bb-c9aa72e9fbf5-kube-api-access-b67wc\") pod \"migrator-74bb7799d9-l82qb\" (UID: \"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" Apr 21 15:37:23.416715 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.416695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b67wc\" (UniqueName: \"kubernetes.io/projected/d9332cef-c45c-4842-b3bb-c9aa72e9fbf5-kube-api-access-b67wc\") pod \"migrator-74bb7799d9-l82qb\" (UID: \"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" Apr 21 15:37:23.965853 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.965825 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/1.log" Apr 21 15:37:23.966259 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:23.966135 2579 scope.go:117] "RemoveContainer" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01" Apr 21 15:37:23.966313 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:23.966280 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321" Apr 21 15:37:24.399471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:24.399445 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 15:37:24.673317 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:24.673252 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7vtdz\"" Apr 21 15:37:24.773382 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:24.773351 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 15:37:24.785119 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:24.785091 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67wc\" (UniqueName: \"kubernetes.io/projected/d9332cef-c45c-4842-b3bb-c9aa72e9fbf5-kube-api-access-b67wc\") pod \"migrator-74bb7799d9-l82qb\" (UID: \"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" Apr 21 15:37:25.046105 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.046039 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" Apr 21 15:37:25.164861 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.164831 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb"] Apr 21 15:37:25.167664 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:25.167638 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9332cef_c45c_4842_b3bb_c9aa72e9fbf5.slice/crio-445db391db40840a1869f7ac04668b6031348410324d2709e4253eb78b742b00 WatchSource:0}: Error finding container 445db391db40840a1869f7ac04668b6031348410324d2709e4253eb78b742b00: Status 404 returned error can't find the container with id 445db391db40840a1869f7ac04668b6031348410324d2709e4253eb78b742b00 Apr 21 15:37:25.777352 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.777313 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dvkn2"] Apr 21 15:37:25.780280 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.780236 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.785467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.785447 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 15:37:25.785786 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.785765 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 15:37:25.785877 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.785765 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 15:37:25.786045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.786025 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-wjftd\"" Apr 21 15:37:25.786045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.786038 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 15:37:25.789933 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.789896 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dvkn2"] Apr 21 15:37:25.836897 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.836871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-key\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.836897 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.836903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78n2z\" (UniqueName: 
\"kubernetes.io/projected/f2256f66-f9fe-498e-b7d1-0d0476078598-kube-api-access-78n2z\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.837065 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.836978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-cabundle\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.937768 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.937738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" Apr 21 15:37:25.937883 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.937771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-key\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.937883 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.937793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78n2z\" (UniqueName: \"kubernetes.io/projected/f2256f66-f9fe-498e-b7d1-0d0476078598-kube-api-access-78n2z\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 
15:37:25.937988 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:25.937962 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:25.938025 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.938013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-cabundle\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.938058 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:25.938019 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:33.938003022 +0000 UTC m=+118.966169799 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:37:25.938582 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.938555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-cabundle\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.940225 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.940205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2256f66-f9fe-498e-b7d1-0d0476078598-signing-key\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.946283 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.946265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78n2z\" (UniqueName: \"kubernetes.io/projected/f2256f66-f9fe-498e-b7d1-0d0476078598-kube-api-access-78n2z\") pod \"service-ca-865cb79987-dvkn2\" (UID: \"f2256f66-f9fe-498e-b7d1-0d0476078598\") " pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:25.971600 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:25.971571 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" event={"ID":"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5","Type":"ContainerStarted","Data":"445db391db40840a1869f7ac04668b6031348410324d2709e4253eb78b742b00"} Apr 21 15:37:26.012655 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:37:26.012635 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2xzlk_a396e4e6-5a05-450a-8a8c-263dd9674c34/dns-node-resolver/0.log" Apr 21 15:37:26.091211 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.091185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dvkn2" Apr 21 15:37:26.341405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.341352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:37:26.341523 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:26.341483 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:37:26.341523 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:26.341498 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6686459d59-wxj7x: secret "image-registry-tls" not found Apr 21 15:37:26.341608 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:26.341548 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls podName:5cbb7c2f-db4c-45f0-886d-922983c0ce02 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:34.341531794 +0000 UTC m=+119.369698568 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls") pod "image-registry-6686459d59-wxj7x" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02") : secret "image-registry-tls" not found Apr 21 15:37:26.360778 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.360752 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dvkn2"] Apr 21 15:37:26.364334 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:26.364312 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2256f66_f9fe_498e_b7d1_0d0476078598.slice/crio-6b6f22bead6d50b90bd13d840fa0305fbde9b1d036dd69a8b26ba9a4ca7b2026 WatchSource:0}: Error finding container 6b6f22bead6d50b90bd13d840fa0305fbde9b1d036dd69a8b26ba9a4ca7b2026: Status 404 returned error can't find the container with id 6b6f22bead6d50b90bd13d840fa0305fbde9b1d036dd69a8b26ba9a4ca7b2026 Apr 21 15:37:26.976432 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.976387 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" event={"ID":"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5","Type":"ContainerStarted","Data":"5469f9eaaedf6fe9f87891d422f531db4c085138e1dfb7600628f9b9c20b0ee7"} Apr 21 15:37:26.976432 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.976427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" event={"ID":"d9332cef-c45c-4842-b3bb-c9aa72e9fbf5","Type":"ContainerStarted","Data":"0813d49763cdea66b6a583af2f10b68e395a18ce2cb930945f692cafc2a97270"} Apr 21 15:37:26.977600 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.977559 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dvkn2" 
event={"ID":"f2256f66-f9fe-498e-b7d1-0d0476078598","Type":"ContainerStarted","Data":"6b6f22bead6d50b90bd13d840fa0305fbde9b1d036dd69a8b26ba9a4ca7b2026"} Apr 21 15:37:26.998728 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:26.998648 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l82qb" podStartSLOduration=2.880374136 podStartE2EDuration="3.998638007s" podCreationTimestamp="2026-04-21 15:37:23 +0000 UTC" firstStartedPulling="2026-04-21 15:37:25.169328398 +0000 UTC m=+110.197495176" lastFinishedPulling="2026-04-21 15:37:26.287592258 +0000 UTC m=+111.315759047" observedRunningTime="2026-04-21 15:37:26.998358747 +0000 UTC m=+112.026525554" watchObservedRunningTime="2026-04-21 15:37:26.998638007 +0000 UTC m=+112.026804803" Apr 21 15:37:27.015664 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:27.015644 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fj5g2_657382f2-3c88-4d85-b5cf-5533d6e4b19e/node-ca/0.log" Apr 21 15:37:28.414030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.414004 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l82qb_d9332cef-c45c-4842-b3bb-c9aa72e9fbf5/migrator/0.log" Apr 21 15:37:28.615605 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.615572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l82qb_d9332cef-c45c-4842-b3bb-c9aa72e9fbf5/graceful-termination/0.log" Apr 21 15:37:28.928722 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.928681 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:37:28.928722 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.928729 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh"
Apr 21 15:37:28.929199 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.929185 2579 scope.go:117] "RemoveContainer" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01"
Apr 21 15:37:28.929413 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:28.929393 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321"
Apr 21 15:37:28.984479 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:28.984445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dvkn2" event={"ID":"f2256f66-f9fe-498e-b7d1-0d0476078598","Type":"ContainerStarted","Data":"cffd0ac47ece0c840c9dd4fa4ea66b281520fa14dab3cbd97899fc237b001e5e"}
Apr 21 15:37:33.999487 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:33.999454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:33.999855 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:33.999588 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 15:37:33.999855 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:33.999646 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls podName:a1f66133-b065-4100-b368-ac1f349bf896 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:49.999630613 +0000 UTC m=+135.027797387 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zf8zt" (UID: "a1f66133-b065-4100-b368-ac1f349bf896") : secret "cluster-monitoring-operator-tls" not found
Apr 21 15:37:34.403251 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:34.403225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x"
Apr 21 15:37:34.405865 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:34.405841 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"image-registry-6686459d59-wxj7x\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " pod="openshift-image-registry/image-registry-6686459d59-wxj7x"
Apr 21 15:37:34.534076 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:34.534050 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6686459d59-wxj7x"
Apr 21 15:37:34.698232 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:34.698110 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-dvkn2" podStartSLOduration=8.121445761 podStartE2EDuration="9.698094487s" podCreationTimestamp="2026-04-21 15:37:25 +0000 UTC" firstStartedPulling="2026-04-21 15:37:26.366171893 +0000 UTC m=+111.394338667" lastFinishedPulling="2026-04-21 15:37:27.942820619 +0000 UTC m=+112.970987393" observedRunningTime="2026-04-21 15:37:29.008078974 +0000 UTC m=+114.036245772" watchObservedRunningTime="2026-04-21 15:37:34.698094487 +0000 UTC m=+119.726261282"
Apr 21 15:37:34.699045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:34.699026 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"]
Apr 21 15:37:34.702608 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:34.702568 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cbb7c2f_db4c_45f0_886d_922983c0ce02.slice/crio-cb2e4478e8e416198e033916cd3774f02672503ff2cab87c91d7b2a57a76c136 WatchSource:0}: Error finding container cb2e4478e8e416198e033916cd3774f02672503ff2cab87c91d7b2a57a76c136: Status 404 returned error can't find the container with id cb2e4478e8e416198e033916cd3774f02672503ff2cab87c91d7b2a57a76c136
Apr 21 15:37:35.004038 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:35.003929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" event={"ID":"5cbb7c2f-db4c-45f0-886d-922983c0ce02","Type":"ContainerStarted","Data":"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a"}
Apr 21 15:37:35.004038 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:35.004000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" event={"ID":"5cbb7c2f-db4c-45f0-886d-922983c0ce02","Type":"ContainerStarted","Data":"cb2e4478e8e416198e033916cd3774f02672503ff2cab87c91d7b2a57a76c136"}
Apr 21 15:37:35.004490 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:35.004041 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6686459d59-wxj7x"
Apr 21 15:37:35.027109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:35.027065 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" podStartSLOduration=17.027030239 podStartE2EDuration="17.027030239s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:37:35.025447384 +0000 UTC m=+120.053614180" watchObservedRunningTime="2026-04-21 15:37:35.027030239 +0000 UTC m=+120.055197073"
Apr 21 15:37:41.604384 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:41.604350 2579 scope.go:117] "RemoveContainer" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01"
Apr 21 15:37:42.022772 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.022702 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log"
Apr 21 15:37:42.023118 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.023103 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/1.log"
Apr 21 15:37:42.023173 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.023136 2579 generic.go:358] "Generic (PLEG): container finished" podID="60c10748-f987-4f76-8f57-6a42bf9f4321" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46" exitCode=255
Apr 21 15:37:42.023173 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.023164 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" event={"ID":"60c10748-f987-4f76-8f57-6a42bf9f4321","Type":"ContainerDied","Data":"0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46"}
Apr 21 15:37:42.023242 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.023191 2579 scope.go:117] "RemoveContainer" containerID="16b889f0a1d6c2486f297174a83e63ea85b50aaad62fd84727f8a5983d8f0e01"
Apr 21 15:37:42.023545 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:42.023527 2579 scope.go:117] "RemoveContainer" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46"
Apr 21 15:37:42.023741 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:42.023718 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321"
Apr 21 15:37:43.026615 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:43.026587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log"
Apr 21 15:37:44.276691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:44.276660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:37:44.279078 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:44.279056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bc7e8-4515-4c77-967b-a70ef32cd85e-metrics-certs\") pod \"network-metrics-daemon-x5zkt\" (UID: \"088bc7e8-4515-4c77-967b-a70ef32cd85e\") " pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:37:44.318426 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:44.318406 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\""
Apr 21 15:37:44.326697 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:44.326680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5zkt"
Apr 21 15:37:44.463568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:44.463536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x5zkt"]
Apr 21 15:37:44.467167 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:44.467141 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088bc7e8_4515_4c77_967b_a70ef32cd85e.slice/crio-f7749bddde4aeee35571a29410cd96a595a8c4426ff6a1130b83b91da97d1063 WatchSource:0}: Error finding container f7749bddde4aeee35571a29410cd96a595a8c4426ff6a1130b83b91da97d1063: Status 404 returned error can't find the container with id f7749bddde4aeee35571a29410cd96a595a8c4426ff6a1130b83b91da97d1063
Apr 21 15:37:45.032279 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.032240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5zkt" event={"ID":"088bc7e8-4515-4c77-967b-a70ef32cd85e","Type":"ContainerStarted","Data":"f7749bddde4aeee35571a29410cd96a595a8c4426ff6a1130b83b91da97d1063"}
Apr 21 15:37:45.587312 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.587279 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-48m9g"]
Apr 21 15:37:45.590579 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.590559 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"]
Apr 21 15:37:45.590835 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.590815 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.597404 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.597384 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 15:37:45.597511 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.597495 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shndr\""
Apr 21 15:37:45.597629 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.597616 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 15:37:45.602613 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.602595 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 15:37:45.604652 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.604631 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 15:37:45.613568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.613550 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-48m9g"]
Apr 21 15:37:45.689064 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.689004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff5f3b20-ad56-4139-ac9f-e02877e78f48-crio-socket\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.689064 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.689046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff5f3b20-ad56-4139-ac9f-e02877e78f48-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.689228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.689117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.689228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.689159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff5f3b20-ad56-4139-ac9f-e02877e78f48-data-volume\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.689228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.689184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbctm\" (UniqueName: \"kubernetes.io/projected/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-api-access-kbctm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790434 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff5f3b20-ad56-4139-ac9f-e02877e78f48-data-volume\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbctm\" (UniqueName: \"kubernetes.io/projected/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-api-access-kbctm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff5f3b20-ad56-4139-ac9f-e02877e78f48-crio-socket\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff5f3b20-ad56-4139-ac9f-e02877e78f48-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790734 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff5f3b20-ad56-4139-ac9f-e02877e78f48-crio-socket\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.790846 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.790825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff5f3b20-ad56-4139-ac9f-e02877e78f48-data-volume\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.791074 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.791052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.792915 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.792895 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff5f3b20-ad56-4139-ac9f-e02877e78f48-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.810980 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.810936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbctm\" (UniqueName: \"kubernetes.io/projected/ff5f3b20-ad56-4139-ac9f-e02877e78f48-kube-api-access-kbctm\") pod \"insights-runtime-extractor-48m9g\" (UID: \"ff5f3b20-ad56-4139-ac9f-e02877e78f48\") " pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:45.918448 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:45.918423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-48m9g"
Apr 21 15:37:46.036453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:46.036423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5zkt" event={"ID":"088bc7e8-4515-4c77-967b-a70ef32cd85e","Type":"ContainerStarted","Data":"f7a8a0fd0570389b8324b3e8c8163a499588f66e04a1ec346088aef57808e63c"}
Apr 21 15:37:46.036453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:46.036456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5zkt" event={"ID":"088bc7e8-4515-4c77-967b-a70ef32cd85e","Type":"ContainerStarted","Data":"cc9ab6e640ffccd01afc9de5ba99f9cf442b61c539f998619e6ed058fced83be"}
Apr 21 15:37:46.050601 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:46.050577 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-48m9g"]
Apr 21 15:37:46.053716 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:46.053690 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5f3b20_ad56_4139_ac9f_e02877e78f48.slice/crio-f5d41880b016d307273b3cb4e976609347f7fe343415c9533dfa3cf6587aa250 WatchSource:0}: Error finding container f5d41880b016d307273b3cb4e976609347f7fe343415c9533dfa3cf6587aa250: Status 404 returned error can't find the container with id f5d41880b016d307273b3cb4e976609347f7fe343415c9533dfa3cf6587aa250
Apr 21 15:37:46.073934 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:46.073881 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x5zkt" podStartSLOduration=130.136358388 podStartE2EDuration="2m11.073864197s" podCreationTimestamp="2026-04-21 15:35:35 +0000 UTC" firstStartedPulling="2026-04-21 15:37:44.468994617 +0000 UTC m=+129.497161391" lastFinishedPulling="2026-04-21 15:37:45.406500413 +0000 UTC m=+130.434667200" observedRunningTime="2026-04-21 15:37:46.072521567 +0000 UTC m=+131.100688363" watchObservedRunningTime="2026-04-21 15:37:46.073864197 +0000 UTC m=+131.102030996"
Apr 21 15:37:47.041364 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:47.041281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48m9g" event={"ID":"ff5f3b20-ad56-4139-ac9f-e02877e78f48","Type":"ContainerStarted","Data":"e4e2d843e7def2ee456e9284faeb869a262938735f2ce7472eae6385dc0d1d10"}
Apr 21 15:37:47.041364 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:47.041325 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48m9g" event={"ID":"ff5f3b20-ad56-4139-ac9f-e02877e78f48","Type":"ContainerStarted","Data":"0aa3b51c673a3a288b176617062cfd553d7620c993ceedfb60d45b435922cdc2"}
Apr 21 15:37:47.041364 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:47.041339 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48m9g" event={"ID":"ff5f3b20-ad56-4139-ac9f-e02877e78f48","Type":"ContainerStarted","Data":"f5d41880b016d307273b3cb4e976609347f7fe343415c9533dfa3cf6587aa250"}
Apr 21 15:37:48.929249 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:48.929213 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh"
Apr 21 15:37:48.929602 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:48.929305 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh"
Apr 21 15:37:48.929602 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:48.929530 2579 scope.go:117] "RemoveContainer" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46"
Apr 21 15:37:48.929695 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:48.929673 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321"
Apr 21 15:37:49.051928 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:49.051897 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48m9g" event={"ID":"ff5f3b20-ad56-4139-ac9f-e02877e78f48","Type":"ContainerStarted","Data":"1f4821704854c37bbccd6684e6e7f00c499913e1337224c63706111c9d78c609"}
Apr 21 15:37:49.052330 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:49.052311 2579 scope.go:117] "RemoveContainer" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46"
Apr 21 15:37:49.052508 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:49.052490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321"
Apr 21 15:37:49.071237 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:49.071189 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-48m9g" podStartSLOduration=2.084094534 podStartE2EDuration="4.071173817s" podCreationTimestamp="2026-04-21 15:37:45 +0000 UTC" firstStartedPulling="2026-04-21 15:37:46.1074454 +0000 UTC m=+131.135612173" lastFinishedPulling="2026-04-21 15:37:48.094524678 +0000 UTC m=+133.122691456" observedRunningTime="2026-04-21 15:37:49.069725817 +0000 UTC m=+134.097892614" watchObservedRunningTime="2026-04-21 15:37:49.071173817 +0000 UTC m=+134.099340617"
Apr 21 15:37:50.020221 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:50.020180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:50.022646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:50.022625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1f66133-b065-4100-b368-ac1f349bf896-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zf8zt\" (UID: \"a1f66133-b065-4100-b368-ac1f349bf896\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:50.058576 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:50.058550 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6qqm6\""
Apr 21 15:37:50.066668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:50.066654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"
Apr 21 15:37:50.178244 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:50.178213 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt"]
Apr 21 15:37:50.181132 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:37:50.181091 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f66133_b065_4100_b368_ac1f349bf896.slice/crio-f1eeacc0bbcc4832b76f8eb495e6b5f87a8f629547c99600ef04d2b1bf66bcff WatchSource:0}: Error finding container f1eeacc0bbcc4832b76f8eb495e6b5f87a8f629547c99600ef04d2b1bf66bcff: Status 404 returned error can't find the container with id f1eeacc0bbcc4832b76f8eb495e6b5f87a8f629547c99600ef04d2b1bf66bcff
Apr 21 15:37:51.060440 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:51.060399 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" event={"ID":"a1f66133-b065-4100-b368-ac1f349bf896","Type":"ContainerStarted","Data":"f1eeacc0bbcc4832b76f8eb495e6b5f87a8f629547c99600ef04d2b1bf66bcff"}
Apr 21 15:37:52.064672 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:52.064636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" event={"ID":"a1f66133-b065-4100-b368-ac1f349bf896","Type":"ContainerStarted","Data":"6d2bcc08fbe202c121abe7a5430682b0f92c45da916c12d7bdaefdf19dc0db6d"}
Apr 21 15:37:52.085931 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:52.085888 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" podStartSLOduration=32.392742321 podStartE2EDuration="34.085873373s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:50.18282928 +0000 UTC m=+135.210996071" lastFinishedPulling="2026-04-21 15:37:51.87596033 +0000 UTC m=+136.904127123" observedRunningTime="2026-04-21 15:37:52.085540386 +0000 UTC m=+137.113707183" watchObservedRunningTime="2026-04-21 15:37:52.085873373 +0000 UTC m=+137.114040169"
Apr 21 15:37:55.597094 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:55.597048 2579 patch_prober.go:28] interesting pod/image-registry-6686459d59-wxj7x container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 15:37:55.597528 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:55.597104 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:37:59.604554 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:37:59.604517 2579 scope.go:117] "RemoveContainer" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46"
Apr 21 15:37:59.604932 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:37:59.604746 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kwhwh_openshift-console-operator(60c10748-f987-4f76-8f57-6a42bf9f4321)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podUID="60c10748-f987-4f76-8f57-6a42bf9f4321"
Apr 21 15:38:00.952141 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.952107 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"]
Apr 21 15:38:00.956700 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.956681 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:00.960540 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.960513 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 15:38:00.960691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.960542 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 21 15:38:00.960691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.960548 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 21 15:38:00.960691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.960613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qd8z8\""
Apr 21 15:38:00.960691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.960523 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 21 15:38:00.969424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:00.969401 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"]
Apr 21 15:38:01.002537 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.002511 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-65cbt"]
Apr 21 15:38:01.004668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.004652 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.007984 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.007962 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 15:38:01.008108 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.007987 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 15:38:01.008257 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.008244 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 15:38:01.008817 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.008799 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-52fql\""
Apr 21 15:38:01.100664 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.100833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3cb44a5b-124e-4afc-a061-3f8c0e6474db-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.100833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.100833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100769 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-sys\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.100833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100796 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-wtmp\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.100833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-accelerators-collector-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101068 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtbq\" (UniqueName: \"kubernetes.io/projected/928583e4-43ad-4abd-acc5-eb09b449e3b7-kube-api-access-zxtbq\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101068 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.100990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.101068 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-textfile\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.101174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvv4\" (UniqueName: \"kubernetes.io/projected/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-api-access-jpvv4\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"
Apr 21 15:38:01.101174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-metrics-client-ca\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-root\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.101339 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.101180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt"
Apr 21 15:38:01.201458 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\"
(UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.201583 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3cb44a5b-124e-4afc-a061-3f8c0e6474db-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.201583 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.201583 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-sys\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.201583 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-wtmp\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" 
Apr 21 15:38:01.201757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-accelerators-collector-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.201757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtbq\" (UniqueName: \"kubernetes.io/projected/928583e4-43ad-4abd-acc5-eb09b449e3b7-kube-api-access-zxtbq\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.201757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-sys\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.201757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.201757 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-textfile\") pod 
\"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-wtmp\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvv4\" (UniqueName: \"kubernetes.io/projected/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-api-access-jpvv4\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201851 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-metrics-client-ca\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-root\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202026 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.201911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.202307 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-textfile\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.202351 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-accelerators-collector-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202620 
ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:01.202599 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:38:01.202761 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.202741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/928583e4-43ad-4abd-acc5-eb09b449e3b7-root\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.202866 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.202695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3cb44a5b-124e-4afc-a061-3f8c0e6474db-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.202957 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:01.202926 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls podName:928583e4-43ad-4abd-acc5-eb09b449e3b7 nodeName:}" failed. No retries permitted until 2026-04-21 15:38:01.702841908 +0000 UTC m=+146.731008695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls") pod "node-exporter-65cbt" (UID: "928583e4-43ad-4abd-acc5-eb09b449e3b7") : secret "node-exporter-tls" not found Apr 21 15:38:01.203047 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.203022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.203101 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.203062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cb44a5b-124e-4afc-a061-3f8c0e6474db-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.203341 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.203318 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/928583e4-43ad-4abd-acc5-eb09b449e3b7-metrics-client-ca\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.205140 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.205113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " 
pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.205215 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.205176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.215154 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.215129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.216262 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.216241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtbq\" (UniqueName: \"kubernetes.io/projected/928583e4-43ad-4abd-acc5-eb09b449e3b7-kube-api-access-zxtbq\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.216539 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.216515 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvv4\" (UniqueName: \"kubernetes.io/projected/3cb44a5b-124e-4afc-a061-3f8c0e6474db-kube-api-access-jpvv4\") pod \"kube-state-metrics-69db897b98-6xnbr\" (UID: \"3cb44a5b-124e-4afc-a061-3f8c0e6474db\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.266391 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.266363 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" Apr 21 15:38:01.423281 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.423252 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6xnbr"] Apr 21 15:38:01.426100 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:01.426060 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb44a5b_124e_4afc_a061_3f8c0e6474db.slice/crio-d3af686ebd46b0c9648ba6e86091d1bcb3011a24c156ea2906870a9ca7991eec WatchSource:0}: Error finding container d3af686ebd46b0c9648ba6e86091d1bcb3011a24c156ea2906870a9ca7991eec: Status 404 returned error can't find the container with id d3af686ebd46b0c9648ba6e86091d1bcb3011a24c156ea2906870a9ca7991eec Apr 21 15:38:01.705795 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.705765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.708191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.708170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/928583e4-43ad-4abd-acc5-eb09b449e3b7-node-exporter-tls\") pod \"node-exporter-65cbt\" (UID: \"928583e4-43ad-4abd-acc5-eb09b449e3b7\") " pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.913765 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:01.913728 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-65cbt" Apr 21 15:38:01.923157 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:01.923113 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928583e4_43ad_4abd_acc5_eb09b449e3b7.slice/crio-afb47d9863f5a8929cd91329186a3f79dfe26830dc76a267317e7d091c2784a4 WatchSource:0}: Error finding container afb47d9863f5a8929cd91329186a3f79dfe26830dc76a267317e7d091c2784a4: Status 404 returned error can't find the container with id afb47d9863f5a8929cd91329186a3f79dfe26830dc76a267317e7d091c2784a4 Apr 21 15:38:02.047029 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.046952 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:38:02.051097 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.051077 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053654 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053690 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053736 2579 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 15:38:02.053807 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053751 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 15:38:02.054175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.053916 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 15:38:02.054175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.054007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 15:38:02.054175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.054025 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-c5x4r\"" Apr 21 15:38:02.054175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.054035 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 15:38:02.064427 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.064399 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:38:02.095802 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.095775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65cbt" event={"ID":"928583e4-43ad-4abd-acc5-eb09b449e3b7","Type":"ContainerStarted","Data":"afb47d9863f5a8929cd91329186a3f79dfe26830dc76a267317e7d091c2784a4"} Apr 21 15:38:02.097522 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.097500 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" 
event={"ID":"3cb44a5b-124e-4afc-a061-3f8c0e6474db","Type":"ContainerStarted","Data":"d3af686ebd46b0c9648ba6e86091d1bcb3011a24c156ea2906870a9ca7991eec"} Apr 21 15:38:02.210272 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrhp\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210386 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210544 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config\") pod \"alertmanager-main-0\" (UID: 
\"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210645 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210932 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.210932 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.210692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311615 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311505 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311615 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:38:02.311740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.311828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311900 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.311973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrhp\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.312534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.312857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.312794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.316239 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.316109 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.316239 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.316232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.316239 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.316265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.316723 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.316682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.318115 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.318090 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.319752 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.319345 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.319752 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.319700 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.319927 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.319908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.320248 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.320205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.321200 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.321161 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nrhp\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.322600 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.322575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume\") pod \"alertmanager-main-0\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.360607 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.360586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:38:02.526586 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.522453 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:38:02.808649 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:02.808600 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca089d8e_69ec_4c46_aded_046febde7982.slice/crio-9e425eb9f257f90c427f4f20e8abf072db8deb3755e4ba1efb51cb5bfa7db13a WatchSource:0}: Error finding container 9e425eb9f257f90c427f4f20e8abf072db8deb3755e4ba1efb51cb5bfa7db13a: Status 404 returned error can't find the container with id 9e425eb9f257f90c427f4f20e8abf072db8deb3755e4ba1efb51cb5bfa7db13a Apr 21 15:38:02.961849 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.961806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6dddb456dc-cz6f5"] Apr 21 15:38:02.965172 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.965156 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:02.969886 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.969844 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 15:38:02.970080 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.970061 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 15:38:02.970208 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.970177 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dvki14qa17ja9\"" Apr 21 15:38:02.970387 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.970270 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 15:38:02.970387 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.970290 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 15:38:02.970609 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.970592 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 15:38:02.971034 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.971016 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wlcvb\"" Apr 21 15:38:02.978337 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:02.978302 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6dddb456dc-cz6f5"] Apr 21 15:38:03.101883 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.101848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-65cbt" event={"ID":"928583e4-43ad-4abd-acc5-eb09b449e3b7","Type":"ContainerStarted","Data":"916f6719914ba7aee07f710c7346fa27922d2ea35e88432ad4aff2db0ca53aa9"} Apr 21 15:38:03.103464 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.103429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"9e425eb9f257f90c427f4f20e8abf072db8deb3755e4ba1efb51cb5bfa7db13a"} Apr 21 15:38:03.104895 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.104875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" event={"ID":"3cb44a5b-124e-4afc-a061-3f8c0e6474db","Type":"ContainerStarted","Data":"7e731d9b5a964c57e409866687176f7f771409a79ba6e6d1d28961d630ac8afa"} Apr 21 15:38:03.119905 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.119828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.119905 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.119896 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120230 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.119977 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120230 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.120006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120230 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.120128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120230 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.120164 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnnm\" (UniqueName: \"kubernetes.io/projected/de1019eb-eae5-4deb-bb4c-327d43db9d13-kube-api-access-fdnnm\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120230 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.120210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de1019eb-eae5-4deb-bb4c-327d43db9d13-metrics-client-ca\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.120488 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.120255 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-grpc-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.221543 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.221511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-grpc-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.221870 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.221822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222004 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.221887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: 
\"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222004 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.221973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.222002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.222093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222223 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.222132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnnm\" (UniqueName: \"kubernetes.io/projected/de1019eb-eae5-4deb-bb4c-327d43db9d13-kube-api-access-fdnnm\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.222223 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:38:03.222160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de1019eb-eae5-4deb-bb4c-327d43db9d13-metrics-client-ca\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.223163 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.223118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de1019eb-eae5-4deb-bb4c-327d43db9d13-metrics-client-ca\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.225718 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.225664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.226178 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.226136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.226367 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.226340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.226453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.226400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.226568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.226519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-grpc-tls\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.227331 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.227188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de1019eb-eae5-4deb-bb4c-327d43db9d13-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.235234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.235182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnnm\" (UniqueName: \"kubernetes.io/projected/de1019eb-eae5-4deb-bb4c-327d43db9d13-kube-api-access-fdnnm\") pod \"thanos-querier-6dddb456dc-cz6f5\" (UID: \"de1019eb-eae5-4deb-bb4c-327d43db9d13\") " 
pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.275099 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.275066 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:03.425185 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:03.425148 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6dddb456dc-cz6f5"] Apr 21 15:38:03.430534 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:03.429502 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde1019eb_eae5_4deb_bb4c_327d43db9d13.slice/crio-429b120cd2a2a506b5244ad1f817d02427f8ab3693c9d7978445e9475f9069f6 WatchSource:0}: Error finding container 429b120cd2a2a506b5244ad1f817d02427f8ab3693c9d7978445e9475f9069f6: Status 404 returned error can't find the container with id 429b120cd2a2a506b5244ad1f817d02427f8ab3693c9d7978445e9475f9069f6 Apr 21 15:38:04.110071 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.110030 2579 generic.go:358] "Generic (PLEG): container finished" podID="928583e4-43ad-4abd-acc5-eb09b449e3b7" containerID="916f6719914ba7aee07f710c7346fa27922d2ea35e88432ad4aff2db0ca53aa9" exitCode=0 Apr 21 15:38:04.110561 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.110120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65cbt" event={"ID":"928583e4-43ad-4abd-acc5-eb09b449e3b7","Type":"ContainerDied","Data":"916f6719914ba7aee07f710c7346fa27922d2ea35e88432ad4aff2db0ca53aa9"} Apr 21 15:38:04.111899 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.111877 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439" exitCode=0 Apr 21 15:38:04.112030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.111962 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439"} Apr 21 15:38:04.113340 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.113314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"429b120cd2a2a506b5244ad1f817d02427f8ab3693c9d7978445e9475f9069f6"} Apr 21 15:38:04.116646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.116625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" event={"ID":"3cb44a5b-124e-4afc-a061-3f8c0e6474db","Type":"ContainerStarted","Data":"316777e53d5128ecc62060241d05bdc7a8588fb24502e61c5d0edbea8261aa97"} Apr 21 15:38:04.116743 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.116654 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" event={"ID":"3cb44a5b-124e-4afc-a061-3f8c0e6474db","Type":"ContainerStarted","Data":"b47a2e40cc8b07a34c35eddbc1bb8ab265d58f8551f2f0662ca3b325e1821d56"} Apr 21 15:38:04.177404 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:04.177352 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-6xnbr" podStartSLOduration=2.5975172669999997 podStartE2EDuration="4.177331922s" podCreationTimestamp="2026-04-21 15:38:00 +0000 UTC" firstStartedPulling="2026-04-21 15:38:01.42800099 +0000 UTC m=+146.456167767" lastFinishedPulling="2026-04-21 15:38:03.007815634 +0000 UTC m=+148.035982422" observedRunningTime="2026-04-21 15:38:04.176189241 +0000 UTC m=+149.204356031" watchObservedRunningTime="2026-04-21 15:38:04.177331922 +0000 UTC m=+149.205498719" Apr 21 15:38:05.121705 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.121592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65cbt" event={"ID":"928583e4-43ad-4abd-acc5-eb09b449e3b7","Type":"ContainerStarted","Data":"26431421517f0218204d842035a372eb7238cef17c0e4829f9414eceacd6d594"} Apr 21 15:38:05.121705 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.121638 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65cbt" event={"ID":"928583e4-43ad-4abd-acc5-eb09b449e3b7","Type":"ContainerStarted","Data":"b118cba513238f2363203fa3251909d1602ac95b0089964f12423f11a48a63f4"} Apr 21 15:38:05.147137 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.147079 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-65cbt" podStartSLOduration=4.062299398 podStartE2EDuration="5.14706095s" podCreationTimestamp="2026-04-21 15:38:00 +0000 UTC" firstStartedPulling="2026-04-21 15:38:01.92481565 +0000 UTC m=+146.952982428" lastFinishedPulling="2026-04-21 15:38:03.009577192 +0000 UTC m=+148.037743980" observedRunningTime="2026-04-21 15:38:05.145818994 +0000 UTC m=+150.173985790" watchObservedRunningTime="2026-04-21 15:38:05.14706095 +0000 UTC m=+150.175227747" Apr 21 15:38:05.578043 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.578010 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-945fb46b9-n7wkp"] Apr 21 15:38:05.581256 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.581227 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" Apr 21 15:38:05.584361 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.584336 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 15:38:05.584613 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.584597 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1jkce675iihgj\"" Apr 21 15:38:05.584714 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.584696 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 15:38:05.584760 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.584696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 15:38:05.585153 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.585134 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:38:05.585252 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.585220 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-rhc4l\"" Apr 21 15:38:05.595225 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.595203 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-945fb46b9-n7wkp"] Apr 21 15:38:05.595447 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.595431 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:38:05.723379 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.723136 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"]
Apr 21 15:38:05.727435 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.727398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"
Apr 21 15:38:05.730115 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.730074 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 21 15:38:05.730243 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.730167 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-ssst5\""
Apr 21 15:38:05.733864 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.733838 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"]
Apr 21 15:38:05.743280 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wnkn\" (UniqueName: \"kubernetes.io/projected/c7328b78-bba6-45e4-a754-8ac5f6456e78-kube-api-access-8wnkn\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743397 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c7328b78-bba6-45e4-a754-8ac5f6456e78-audit-log\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743397 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-tls\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743397 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-client-certs\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-metrics-server-audit-profiles\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.743531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.743491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-client-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844171 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wnkn\" (UniqueName: \"kubernetes.io/projected/c7328b78-bba6-45e4-a754-8ac5f6456e78-kube-api-access-8wnkn\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16063ffa-ebfd-4360-88c3-de39e192699f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jj9dg\" (UID: \"16063ffa-ebfd-4360-88c3-de39e192699f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"
Apr 21 15:38:05.844375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c7328b78-bba6-45e4-a754-8ac5f6456e78-audit-log\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-tls\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844511 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-client-certs\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844511 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-metrics-server-audit-profiles\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844511 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844511 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-client-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.844636 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.844560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c7328b78-bba6-45e4-a754-8ac5f6456e78-audit-log\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.845707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.845671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.846165 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.846137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c7328b78-bba6-45e4-a754-8ac5f6456e78-metrics-server-audit-profiles\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.847292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.847269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-client-certs\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.847452 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.847435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-client-ca-bundle\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.847539 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.847463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c7328b78-bba6-45e4-a754-8ac5f6456e78-secret-metrics-server-tls\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.853310 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.853290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wnkn\" (UniqueName: \"kubernetes.io/projected/c7328b78-bba6-45e4-a754-8ac5f6456e78-kube-api-access-8wnkn\") pod \"metrics-server-945fb46b9-n7wkp\" (UID: \"c7328b78-bba6-45e4-a754-8ac5f6456e78\") " pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.891100 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.891060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp"
Apr 21 15:38:05.945874 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.945835 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16063ffa-ebfd-4360-88c3-de39e192699f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jj9dg\" (UID: \"16063ffa-ebfd-4360-88c3-de39e192699f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"
Apr 21 15:38:05.948454 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:05.948433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16063ffa-ebfd-4360-88c3-de39e192699f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jj9dg\" (UID: \"16063ffa-ebfd-4360-88c3-de39e192699f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"
Apr 21 15:38:06.039456 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.039417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"
Apr 21 15:38:06.139990 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.137474 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-945fb46b9-n7wkp"]
Apr 21 15:38:06.140863 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.140776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"68f562673304d69dfce045a67fd6174c8c8a6eaab4fb16374d546773a80f3152"}
Apr 21 15:38:06.140863 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.140824 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"8421a471ab8312cef87f569372f142168eb6a85a545e85b0656ba2b0392514cd"}
Apr 21 15:38:06.140863 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.140841 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"fd46bf4a3b819b174bc606ef45951da5a5ffb41655dd4367fb68f2337cf70706"}
Apr 21 15:38:06.194058 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:06.194021 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg"]
Apr 21 15:38:06.198003 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:06.197624 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16063ffa_ebfd_4360_88c3_de39e192699f.slice/crio-cb5c8c36306d708d5ba1f9e729af6e91a24e492fbce1c110016baa75baa5875c WatchSource:0}: Error finding container cb5c8c36306d708d5ba1f9e729af6e91a24e492fbce1c110016baa75baa5875c: Status 404 returned error can't find the container with id cb5c8c36306d708d5ba1f9e729af6e91a24e492fbce1c110016baa75baa5875c
Apr 21 15:38:07.144996 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.144929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg" event={"ID":"16063ffa-ebfd-4360-88c3-de39e192699f","Type":"ContainerStarted","Data":"cb5c8c36306d708d5ba1f9e729af6e91a24e492fbce1c110016baa75baa5875c"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148710 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"}
Apr 21 15:38:07.148821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.148800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerStarted","Data":"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"}
Apr 21 15:38:07.152002 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.151877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"81a6dae3aea1cdd4e7d44bb7d1ce9c315e1dc6eb350b82f5e7943ec7ffa1ccf8"}
Apr 21 15:38:07.152002 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.151915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"3e2c18ec9e553e85e43223bf75bb9d14549b4485280b5cfba3141eb7533e7074"}
Apr 21 15:38:07.152002 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.151931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" event={"ID":"de1019eb-eae5-4deb-bb4c-327d43db9d13","Type":"ContainerStarted","Data":"23747aeb111d0fe8bca61675c0027df1438101dd5b1765bcb5810c487ac66858"}
Apr 21 15:38:07.152235 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.152094 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5"
Apr 21 15:38:07.153373 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.153341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" event={"ID":"c7328b78-bba6-45e4-a754-8ac5f6456e78","Type":"ContainerStarted","Data":"b87f4993e82ca07f2a7380994f05022e2cf0a9ec63feac2b6aa3faed1bcd1499"}
Apr 21 15:38:07.191277 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.190301 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.9501517910000001 podStartE2EDuration="5.19028241s" podCreationTimestamp="2026-04-21 15:38:02 +0000 UTC" firstStartedPulling="2026-04-21 15:38:02.810455986 +0000 UTC m=+147.838622762" lastFinishedPulling="2026-04-21 15:38:06.050586606 +0000 UTC m=+151.078753381" observedRunningTime="2026-04-21 15:38:07.189532534 +0000 UTC m=+152.217699331" watchObservedRunningTime="2026-04-21 15:38:07.19028241 +0000 UTC m=+152.218449205"
Apr 21 15:38:07.230244 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.230172 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" podStartSLOduration=2.116590312 podStartE2EDuration="5.230155897s" podCreationTimestamp="2026-04-21 15:38:02 +0000 UTC" firstStartedPulling="2026-04-21 15:38:03.432022042 +0000 UTC m=+148.460188820" lastFinishedPulling="2026-04-21 15:38:06.545587632 +0000 UTC m=+151.573754405" observedRunningTime="2026-04-21 15:38:07.227901009 +0000 UTC m=+152.256067808" watchObservedRunningTime="2026-04-21 15:38:07.230155897 +0000 UTC m=+152.258322694"
Apr 21 15:38:07.295654 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.295618 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 15:38:07.300900 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.300864 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.303504 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.303475 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 21 15:38:07.303646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.303633 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 21 15:38:07.303960 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.303860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 21 15:38:07.304106 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.303992 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 21 15:38:07.304188 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rfndn\""
Apr 21 15:38:07.304188 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304127 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 21 15:38:07.304188 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304122 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 21 15:38:07.304329 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304249 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 21 15:38:07.304767 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304681 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 21 15:38:07.305144 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.304927 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 21 15:38:07.305300 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.305274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 21 15:38:07.305742 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.305724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 21 15:38:07.305843 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.305741 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-73cjef5q5dpua\""
Apr 21 15:38:07.308131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.308013 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 21 15:38:07.317690 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.317661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 15:38:07.365935 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.365900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366128 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.365969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366128 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.365996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366128 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366128 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366128 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366115 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366302 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366302 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzfq\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366302 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366420 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366420 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366420 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366554 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366554 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366554 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366554 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366749 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.366749 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.366633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trzfq\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.467860 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467854 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.467972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468726 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468584 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.468726 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468601 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.469000 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.468854 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.470224 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.469899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.471763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.470782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.471763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.471435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.471763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.471580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:38:07.471763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.471587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.471763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.471644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.472853 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.472690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.474975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.474535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.474975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.474561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.474975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.474817 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.474975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.474844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.474975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.474862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.475291 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.475269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.476140 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.476116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.480583 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.480564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzfq\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq\") pod 
\"prometheus-k8s-0\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.619008 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.618976 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:38:07.975730 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:07.975696 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:38:07.979442 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:07.979408 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d57ff44_23c3_46a1_aeb8_61fb85c14a53.slice/crio-75076d41df3c11a6a98e2855daa28df13e165cc054b747d097694c8dff2f2323 WatchSource:0}: Error finding container 75076d41df3c11a6a98e2855daa28df13e165cc054b747d097694c8dff2f2323: Status 404 returned error can't find the container with id 75076d41df3c11a6a98e2855daa28df13e165cc054b747d097694c8dff2f2323 Apr 21 15:38:08.157673 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.157582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" event={"ID":"c7328b78-bba6-45e4-a754-8ac5f6456e78","Type":"ContainerStarted","Data":"5d459fb98d5b93f6e0b4b21ebfec2468afb0086d51bae1dc9bd514faef27b8f8"} Apr 21 15:38:08.158931 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.158905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg" event={"ID":"16063ffa-ebfd-4360-88c3-de39e192699f","Type":"ContainerStarted","Data":"a5a52b26a4db0e378527a6af58d9dc572852bcc8fd20daf28e6b1a60ff3d76ae"} Apr 21 15:38:08.159116 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.159068 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg" Apr 21 15:38:08.160318 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.160295 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" exitCode=0 Apr 21 15:38:08.160415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.160377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} Apr 21 15:38:08.160415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.160409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"75076d41df3c11a6a98e2855daa28df13e165cc054b747d097694c8dff2f2323"} Apr 21 15:38:08.164458 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.164443 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg" Apr 21 15:38:08.185309 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.185231 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" podStartSLOduration=1.499622935 podStartE2EDuration="3.185217161s" podCreationTimestamp="2026-04-21 15:38:05 +0000 UTC" firstStartedPulling="2026-04-21 15:38:06.150764392 +0000 UTC m=+151.178931171" lastFinishedPulling="2026-04-21 15:38:07.836358606 +0000 UTC m=+152.864525397" observedRunningTime="2026-04-21 15:38:08.185154748 +0000 UTC m=+153.213321546" watchObservedRunningTime="2026-04-21 15:38:08.185217161 +0000 UTC m=+153.213383958" Apr 21 15:38:08.266727 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:08.266669 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jj9dg" podStartSLOduration=1.632339667 podStartE2EDuration="3.266651833s" podCreationTimestamp="2026-04-21 15:38:05 +0000 UTC" firstStartedPulling="2026-04-21 15:38:06.200140029 +0000 UTC m=+151.228306802" lastFinishedPulling="2026-04-21 15:38:07.834452178 +0000 UTC m=+152.862618968" observedRunningTime="2026-04-21 15:38:08.265981478 +0000 UTC m=+153.294148278" watchObservedRunningTime="2026-04-21 15:38:08.266651833 +0000 UTC m=+153.294818632" Apr 21 15:38:10.616216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:10.616146 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerName="registry" containerID="cri-o://13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a" gracePeriod=30 Apr 21 15:38:10.942672 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:10.942645 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:38:11.001414 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001212 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001414 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001254 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001414 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001285 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001414 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001340 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001427 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: 
\"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001573 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001623 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.001646 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.001646 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6945m\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m\") pod \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\" (UID: \"5cbb7c2f-db4c-45f0-886d-922983c0ce02\") " Apr 21 15:38:11.002890 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.002421 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:38:11.002890 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.002515 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:38:11.004504 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.004460 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:38:11.005166 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.004884 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m" (OuterVolumeSpecName: "kube-api-access-6945m") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "kube-api-access-6945m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:38:11.005166 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.005009 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:38:11.005314 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.005239 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:38:11.005749 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.005727 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:38:11.013459 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.013433 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5cbb7c2f-db4c-45f0-886d-922983c0ce02" (UID: "5cbb7c2f-db4c-45f0-886d-922983c0ce02"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103421 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-tls\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103454 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6945m\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-kube-api-access-6945m\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103470 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-image-registry-private-configuration\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103484 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-trusted-ca\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103497 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cbb7c2f-db4c-45f0-886d-922983c0ce02-bound-sa-token\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103510 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5cbb7c2f-db4c-45f0-886d-922983c0ce02-ca-trust-extracted\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 
15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103522 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5cbb7c2f-db4c-45f0-886d-922983c0ce02-registry-certificates\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.104013 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.103535 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5cbb7c2f-db4c-45f0-886d-922983c0ce02-installation-pull-secrets\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\"" Apr 21 15:38:11.175110 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.175078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} Apr 21 15:38:11.175198 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.175124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} Apr 21 15:38:11.175198 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.175139 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} Apr 21 15:38:11.175198 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.175154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} Apr 21 
15:38:11.176623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.176580 2579 generic.go:358] "Generic (PLEG): container finished" podID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerID="13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a" exitCode=0 Apr 21 15:38:11.176728 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.176644 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" Apr 21 15:38:11.176780 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.176651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" event={"ID":"5cbb7c2f-db4c-45f0-886d-922983c0ce02","Type":"ContainerDied","Data":"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a"} Apr 21 15:38:11.176780 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.176761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6686459d59-wxj7x" event={"ID":"5cbb7c2f-db4c-45f0-886d-922983c0ce02","Type":"ContainerDied","Data":"cb2e4478e8e416198e033916cd3774f02672503ff2cab87c91d7b2a57a76c136"} Apr 21 15:38:11.176878 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.176786 2579 scope.go:117] "RemoveContainer" containerID="13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a" Apr 21 15:38:11.186529 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.186494 2579 scope.go:117] "RemoveContainer" containerID="13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a" Apr 21 15:38:11.186787 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:11.186761 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a\": container with ID starting with 13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a not found: ID does not exist" 
containerID="13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a" Apr 21 15:38:11.186861 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.186800 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a"} err="failed to get container status \"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a\": rpc error: code = NotFound desc = could not find container \"13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a\": container with ID starting with 13e67d9cddaf8b90e9e1bcebf4224044b9c3c4f91c80c90076992a9cbc6e2e9a not found: ID does not exist" Apr 21 15:38:11.219175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.219150 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"] Apr 21 15:38:11.226664 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.226646 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6686459d59-wxj7x"] Apr 21 15:38:11.410379 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:11.410290 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dgpm6" podUID="704d38f9-6323-48bf-b8f7-977c83275b82" Apr 21 15:38:11.427867 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:11.427833 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-klb4g" podUID="dff5e891-a3c3-4526-94e0-f1c91d517e9d" Apr 21 15:38:11.610435 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:11.610403 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" 
path="/var/lib/kubelet/pods/5cbb7c2f-db4c-45f0-886d-922983c0ce02/volumes" Apr 21 15:38:12.183751 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.183711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} Apr 21 15:38:12.183751 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.183736 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:12.183751 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.183754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:38:12.184350 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.183754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerStarted","Data":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} Apr 21 15:38:12.221547 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.221507 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.540198815 podStartE2EDuration="5.22149443s" podCreationTimestamp="2026-04-21 15:38:07 +0000 UTC" firstStartedPulling="2026-04-21 15:38:08.161864979 +0000 UTC m=+153.190031758" lastFinishedPulling="2026-04-21 15:38:10.843160581 +0000 UTC m=+155.871327373" observedRunningTime="2026-04-21 15:38:12.219623522 +0000 UTC m=+157.247790322" watchObservedRunningTime="2026-04-21 15:38:12.22149443 +0000 UTC m=+157.249661273" Apr 21 15:38:12.619210 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:12.619173 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
15:38:13.166688 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:13.166664 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6dddb456dc-cz6f5" Apr 21 15:38:14.228460 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:14.228369 2579 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Apr 21 15:38:14.228866 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:38:14.228571 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0 podName:9d57ff44-23c3-46a1-aeb8-61fb85c14a53 nodeName:}" failed. No retries permitted until 2026-04-21 15:38:14.728428798 +0000 UTC m=+159.756595588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53") : configmap "prometheus-k8s-rulefiles-0" not found Apr 21 15:38:14.604858 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:14.604830 2579 scope.go:117] "RemoveContainer" containerID="0685941fa7ac6328ac83bf696c6b644990d3eebc8cd2c5996db4e41cf4b04c46" Apr 21 15:38:15.194666 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:15.194640 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:38:15.194819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:15.194776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" event={"ID":"60c10748-f987-4f76-8f57-6a42bf9f4321","Type":"ContainerStarted","Data":"c0eb4108ff4acc5dda9784e3f0d1803e08b8ff6432db2bea93b0dfb68a749fbc"} Apr 21 15:38:15.195197 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:15.195175 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:38:15.199789 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:15.199765 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" Apr 21 15:38:15.224857 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:15.224809 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-kwhwh" podStartSLOduration=55.353677119 podStartE2EDuration="57.224794546s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:19.051091068 +0000 UTC m=+104.079257842" lastFinishedPulling="2026-04-21 15:37:20.922208492 +0000 UTC m=+105.950375269" observedRunningTime="2026-04-21 15:38:15.224145827 +0000 UTC m=+160.252328096" watchObservedRunningTime="2026-04-21 15:38:15.224794546 +0000 UTC m=+160.252961340" Apr 21 15:38:16.245249 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.245205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:16.245705 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.245353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:38:16.247630 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.247607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/704d38f9-6323-48bf-b8f7-977c83275b82-metrics-tls\") pod \"dns-default-dgpm6\" (UID: \"704d38f9-6323-48bf-b8f7-977c83275b82\") " pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:16.247848 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.247827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dff5e891-a3c3-4526-94e0-f1c91d517e9d-cert\") pod \"ingress-canary-klb4g\" (UID: \"dff5e891-a3c3-4526-94e0-f1c91d517e9d\") " pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:38:16.388706 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.388671 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\"" Apr 21 15:38:16.389512 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.389497 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\"" Apr 21 15:38:16.395488 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.395473 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:16.395599 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.395576 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-klb4g" Apr 21 15:38:16.758721 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.758693 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-klb4g"] Apr 21 15:38:16.763262 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:16.762296 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff5e891_a3c3_4526_94e0_f1c91d517e9d.slice/crio-a2c424664c330d4b51772ff1b54a05b6f5cb085564d80d6046da5ea9b382357d WatchSource:0}: Error finding container a2c424664c330d4b51772ff1b54a05b6f5cb085564d80d6046da5ea9b382357d: Status 404 returned error can't find the container with id a2c424664c330d4b51772ff1b54a05b6f5cb085564d80d6046da5ea9b382357d Apr 21 15:38:16.763262 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:16.763215 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dgpm6"] Apr 21 15:38:16.767518 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:38:16.767482 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704d38f9_6323_48bf_b8f7_977c83275b82.slice/crio-f1b704536a526b05df3c242233fbd54605260a1662f5ae5a46021406694209ec WatchSource:0}: Error finding container f1b704536a526b05df3c242233fbd54605260a1662f5ae5a46021406694209ec: Status 404 returned error can't find the container with id f1b704536a526b05df3c242233fbd54605260a1662f5ae5a46021406694209ec Apr 21 15:38:17.201663 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:17.201623 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-klb4g" event={"ID":"dff5e891-a3c3-4526-94e0-f1c91d517e9d","Type":"ContainerStarted","Data":"a2c424664c330d4b51772ff1b54a05b6f5cb085564d80d6046da5ea9b382357d"} Apr 21 15:38:17.204761 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:17.203657 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dgpm6" event={"ID":"704d38f9-6323-48bf-b8f7-977c83275b82","Type":"ContainerStarted","Data":"f1b704536a526b05df3c242233fbd54605260a1662f5ae5a46021406694209ec"} Apr 21 15:38:19.211718 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.211617 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-klb4g" event={"ID":"dff5e891-a3c3-4526-94e0-f1c91d517e9d","Type":"ContainerStarted","Data":"f9c7528a7b73dc2fa9d812786590705de5ee1e7ba831cab8fc537d1b81827f58"} Apr 21 15:38:19.213205 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.213182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dgpm6" event={"ID":"704d38f9-6323-48bf-b8f7-977c83275b82","Type":"ContainerStarted","Data":"5223846ca875c5fee7842c46a32ca8e2402b0480d3eff2b0f01e69627fe35df9"} Apr 21 15:38:19.213205 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.213208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dgpm6" event={"ID":"704d38f9-6323-48bf-b8f7-977c83275b82","Type":"ContainerStarted","Data":"56de3d0cab09729e6e3b111b5554e29311a34e84970f3a78c2cf6b20afc59700"} Apr 21 15:38:19.213323 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.213297 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:19.230066 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.230024 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-klb4g" podStartSLOduration=129.112448671 podStartE2EDuration="2m11.230011455s" podCreationTimestamp="2026-04-21 15:36:08 +0000 UTC" firstStartedPulling="2026-04-21 15:38:16.764485334 +0000 UTC m=+161.792652123" lastFinishedPulling="2026-04-21 15:38:18.882048132 +0000 UTC m=+163.910214907" observedRunningTime="2026-04-21 15:38:19.229205712 +0000 UTC 
m=+164.257372508" watchObservedRunningTime="2026-04-21 15:38:19.230011455 +0000 UTC m=+164.258178251" Apr 21 15:38:19.250546 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:19.250500 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dgpm6" podStartSLOduration=129.704105786 podStartE2EDuration="2m11.25048591s" podCreationTimestamp="2026-04-21 15:36:08 +0000 UTC" firstStartedPulling="2026-04-21 15:38:16.769394652 +0000 UTC m=+161.797561439" lastFinishedPulling="2026-04-21 15:38:18.31577479 +0000 UTC m=+163.343941563" observedRunningTime="2026-04-21 15:38:19.248973869 +0000 UTC m=+164.277140691" watchObservedRunningTime="2026-04-21 15:38:19.25048591 +0000 UTC m=+164.278652706" Apr 21 15:38:25.891351 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:25.891324 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" Apr 21 15:38:25.891688 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:25.891382 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" Apr 21 15:38:29.218281 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:29.218253 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dgpm6" Apr 21 15:38:43.549506 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:43.549474 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/init-config-reloader/0.log" Apr 21 15:38:43.557003 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:43.556977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/alertmanager/0.log" Apr 21 15:38:43.646080 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:43.646058 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/config-reloader/0.log" Apr 21 15:38:43.846884 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:43.846855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/kube-rbac-proxy-web/0.log" Apr 21 15:38:44.047446 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:44.047422 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/kube-rbac-proxy/0.log" Apr 21 15:38:44.246767 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:44.246688 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/kube-rbac-proxy-metric/0.log" Apr 21 15:38:44.445863 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:44.445835 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca089d8e-69ec-4c46-aded-046febde7982/prom-label-proxy/0.log" Apr 21 15:38:44.650134 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:44.650105 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:38:44.849279 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:44.849245 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-state-metrics/0.log" Apr 21 15:38:45.047088 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.047019 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-rbac-proxy-main/0.log" Apr 21 15:38:45.250782 ip-10-0-132-141 kubenswrapper[2579]: 
I0421 15:38:45.250753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-rbac-proxy-self/0.log" Apr 21 15:38:45.452320 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.452297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-945fb46b9-n7wkp_c7328b78-bba6-45e4-a754-8ac5f6456e78/metrics-server/0.log" Apr 21 15:38:45.648572 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.648545 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jj9dg_16063ffa-ebfd-4360-88c3-de39e192699f/monitoring-plugin/0.log" Apr 21 15:38:45.847087 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.847062 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/init-textfile/0.log" Apr 21 15:38:45.896480 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.896456 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" Apr 21 15:38:45.900356 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:45.900338 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-945fb46b9-n7wkp" Apr 21 15:38:46.046565 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:46.046537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/node-exporter/0.log" Apr 21 15:38:46.246117 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:46.246018 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/kube-rbac-proxy/0.log" Apr 21 15:38:48.249782 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:48.249757 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/init-config-reloader/0.log" Apr 21 15:38:48.448397 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:48.448373 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/prometheus/0.log" Apr 21 15:38:48.646260 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:48.646225 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/config-reloader/0.log" Apr 21 15:38:48.846467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:48.846446 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/thanos-sidecar/0.log" Apr 21 15:38:49.045634 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:49.045568 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/kube-rbac-proxy-web/0.log" Apr 21 15:38:49.246000 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:49.245933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/kube-rbac-proxy/0.log" Apr 21 15:38:49.445821 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:49.445798 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9d57ff44-23c3-46a1-aeb8-61fb85c14a53/kube-rbac-proxy-thanos/0.log" Apr 21 15:38:50.247123 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:50.247093 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/thanos-query/0.log" Apr 21 15:38:50.449594 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:50.449567 2579 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-web/0.log" Apr 21 15:38:50.645105 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:50.645077 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy/0.log" Apr 21 15:38:50.847780 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:50.847755 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/prom-label-proxy/0.log" Apr 21 15:38:51.046207 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:51.046143 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-rules/0.log" Apr 21 15:38:51.245531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:51.245507 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-metrics/0.log" Apr 21 15:38:51.645058 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:51.645032 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:38:51.846648 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:38:51.846615 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/3.log" Apr 21 15:39:07.620152 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:07.620123 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:07.640468 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:39:07.640442 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:08.378931 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:08.378902 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:20.403072 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:20.403046 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:39:20.403379 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:20.403087 2579 generic.go:358] "Generic (PLEG): container finished" podID="a1f66133-b065-4100-b368-ac1f349bf896" containerID="6d2bcc08fbe202c121abe7a5430682b0f92c45da916c12d7bdaefdf19dc0db6d" exitCode=2 Apr 21 15:39:20.403379 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:20.403137 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" event={"ID":"a1f66133-b065-4100-b368-ac1f349bf896","Type":"ContainerDied","Data":"6d2bcc08fbe202c121abe7a5430682b0f92c45da916c12d7bdaefdf19dc0db6d"} Apr 21 15:39:20.403488 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:20.403470 2579 scope.go:117] "RemoveContainer" containerID="6d2bcc08fbe202c121abe7a5430682b0f92c45da916c12d7bdaefdf19dc0db6d" Apr 21 15:39:21.407184 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:21.407156 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:39:21.407601 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:21.407207 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zf8zt" 
event={"ID":"a1f66133-b065-4100-b368-ac1f349bf896","Type":"ContainerStarted","Data":"e4b0bf0116141c0ecb690577399c6a91f3cff22ca5483524689897d501c4e85a"} Apr 21 15:39:24.984401 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984362 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:39:24.984833 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984787 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="alertmanager" containerID="cri-o://430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d" gracePeriod=120 Apr 21 15:39:24.984901 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984851 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-metric" containerID="cri-o://80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c" gracePeriod=120 Apr 21 15:39:24.984989 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984879 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-web" containerID="cri-o://7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2" gracePeriod=120 Apr 21 15:39:24.984989 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984952 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="prom-label-proxy" containerID="cri-o://c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065" gracePeriod=120 Apr 21 15:39:24.985096 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984979 2579 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy" containerID="cri-o://4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a" gracePeriod=120 Apr 21 15:39:24.985191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:24.984917 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="config-reloader" containerID="cri-o://eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956" gracePeriod=120 Apr 21 15:39:25.422191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422159 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065" exitCode=0 Apr 21 15:39:25.422191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422182 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c" exitCode=0 Apr 21 15:39:25.422191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422190 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a" exitCode=0 Apr 21 15:39:25.422191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422196 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956" exitCode=0 Apr 21 15:39:25.422191 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422201 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" 
containerID="430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d" exitCode=0 Apr 21 15:39:25.422474 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"} Apr 21 15:39:25.422474 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"} Apr 21 15:39:25.422474 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"} Apr 21 15:39:25.422474 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"} Apr 21 15:39:25.422474 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:25.422294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"} Apr 21 15:39:26.222208 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.222187 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.288657 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288634 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.288776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288689 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nrhp\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.288776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288708 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.288776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288729 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.288776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288754 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.288776 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288771 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288796 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288825 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288855 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288887 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:39:26.288925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.288986 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289050 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.289032 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ca089d8e-69ec-4c46-aded-046febde7982\" (UID: \"ca089d8e-69ec-4c46-aded-046febde7982\") " Apr 21 15:39:26.289428 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.289176 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:26.289428 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.289325 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.290280 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.290049 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:26.291038 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.290764 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:26.292396 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.292349 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.293376 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.293338 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out" (OuterVolumeSpecName: "config-out") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:26.293688 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.293659 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.293865 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.293834 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.294628 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.294603 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:39:26.295329 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.295294 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.295579 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.295543 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.295887 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.295862 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp" (OuterVolumeSpecName: "kube-api-access-7nrhp") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "kube-api-access-7nrhp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:39:26.299261 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.299225 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.305729 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.305706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config" (OuterVolumeSpecName: "web-config") pod "ca089d8e-69ec-4c46-aded-046febde7982" (UID: "ca089d8e-69ec-4c46-aded-046febde7982"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390362 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-alertmanager-main-db\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390389 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390399 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca089d8e-69ec-4c46-aded-046febde7982-config-out\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390408 2579 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-cluster-tls-config\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390417 2579 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-config-volume\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390425 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390433 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390433 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-web-config\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390703 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390442 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nrhp\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-kube-api-access-7nrhp\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390703 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390449 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca089d8e-69ec-4c46-aded-046febde7982-tls-assets\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390703 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390457 2579 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca089d8e-69ec-4c46-aded-046febde7982-metrics-client-ca\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390703 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390465 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-main-tls\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.390703 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.390474 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca089d8e-69ec-4c46-aded-046febde7982-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:26.428242 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.428212 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca089d8e-69ec-4c46-aded-046febde7982" containerID="7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2" exitCode=0
Apr 21 15:39:26.428360 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.428278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"}
Apr 21 15:39:26.428360 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.428313 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.428360 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.428325 2579 scope.go:117] "RemoveContainer" containerID="c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"
Apr 21 15:39:26.428524 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.428313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca089d8e-69ec-4c46-aded-046febde7982","Type":"ContainerDied","Data":"9e425eb9f257f90c427f4f20e8abf072db8deb3755e4ba1efb51cb5bfa7db13a"}
Apr 21 15:39:26.435882 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.435857 2579 scope.go:117] "RemoveContainer" containerID="80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"
Apr 21 15:39:26.442477 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.442461 2579 scope.go:117] "RemoveContainer" containerID="4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"
Apr 21 15:39:26.448582 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.448569 2579 scope.go:117] "RemoveContainer" containerID="7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"
Apr 21 15:39:26.453134 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.453113 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:39:26.455345 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.455326 2579 scope.go:117] "RemoveContainer" containerID="eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"
Apr 21 15:39:26.461131 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.461110 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:39:26.464114 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.464086 2579 scope.go:117] "RemoveContainer" containerID="430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"
Apr 21 15:39:26.470655 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.470635 2579 scope.go:117] "RemoveContainer" containerID="cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439"
Apr 21 15:39:26.476691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.476677 2579 scope.go:117] "RemoveContainer" containerID="c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"
Apr 21 15:39:26.476931 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.476913 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065\": container with ID starting with c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065 not found: ID does not exist" containerID="c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"
Apr 21 15:39:26.477001 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.476955 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065"} err="failed to get container status \"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065\": rpc error: code = NotFound desc = could not find container \"c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065\": container with ID starting with c8c4c9570870b39ae967bb79009c33acd27ea8aac8246a60471bed0bf7ff8065 not found: ID does not exist"
Apr 21 15:39:26.477001 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.476976 2579 scope.go:117] "RemoveContainer" containerID="80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"
Apr 21 15:39:26.477203 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.477182 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c\": container with ID starting with 80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c not found: ID does not exist" containerID="80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"
Apr 21 15:39:26.477287 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477208 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c"} err="failed to get container status \"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c\": rpc error: code = NotFound desc = could not find container \"80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c\": container with ID starting with 80bc93bcbee8719ccc023daa27be2ca11ed54d7c776b49bb5f688b566ca3891c not found: ID does not exist"
Apr 21 15:39:26.477287 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477225 2579 scope.go:117] "RemoveContainer" containerID="4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"
Apr 21 15:39:26.477481 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.477452 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a\": container with ID starting with 4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a not found: ID does not exist" containerID="4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"
Apr 21 15:39:26.477518 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477486 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a"} err="failed to get container status \"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a\": rpc error: code = NotFound desc = could not find container \"4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a\": container with ID starting with 4c2f4d47e0d6cbff8042f7eada059ba786b11995d29c9eafd8f313b4c8f3849a not found: ID does not exist"
Apr 21 15:39:26.477518 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477500 2579 scope.go:117] "RemoveContainer" containerID="7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"
Apr 21 15:39:26.477716 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.477699 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2\": container with ID starting with 7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2 not found: ID does not exist" containerID="7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"
Apr 21 15:39:26.477760 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477722 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2"} err="failed to get container status \"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2\": rpc error: code = NotFound desc = could not find container \"7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2\": container with ID starting with 7675ac7bb0ff6a88087cfcd1c9535dd0b911531ea436946f158bac4f9a45fac2 not found: ID does not exist"
Apr 21 15:39:26.477760 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477742 2579 scope.go:117] "RemoveContainer" containerID="eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"
Apr 21 15:39:26.477988 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.477970 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956\": container with ID starting with eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956 not found: ID does not exist" containerID="eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"
Apr 21 15:39:26.478034 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.477995 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956"} err="failed to get container status \"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956\": rpc error: code = NotFound desc = could not find container \"eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956\": container with ID starting with eeeb7e828470e1ff0da00e07767ec06f58dbe3f42f69eca89423a86db7a4b956 not found: ID does not exist"
Apr 21 15:39:26.478034 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.478009 2579 scope.go:117] "RemoveContainer" containerID="430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"
Apr 21 15:39:26.478219 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.478204 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d\": container with ID starting with 430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d not found: ID does not exist" containerID="430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"
Apr 21 15:39:26.478253 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.478225 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d"} err="failed to get container status \"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d\": rpc error: code = NotFound desc = could not find container \"430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d\": container with ID starting with 430f82fecb254eea7723aa14b42750d9154ec4de324a749946bec2799c06229d not found: ID does not exist"
Apr 21 15:39:26.478253 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.478237 2579 scope.go:117] "RemoveContainer" containerID="cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439"
Apr 21 15:39:26.478427 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:26.478412 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439\": container with ID starting with cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439 not found: ID does not exist" containerID="cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439"
Apr 21 15:39:26.478465 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.478431 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439"} err="failed to get container status \"cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439\": rpc error: code = NotFound desc = could not find container \"cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439\": container with ID starting with cb4ab7cfb46cc17424ad0eafa23a14b33b1659be6a150255ee8e5cb244eda439 not found: ID does not exist"
Apr 21 15:39:26.495933 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.495907 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:39:26.496206 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496192 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="init-config-reloader"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496207 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="init-config-reloader"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496216 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="config-reloader"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496222 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="config-reloader"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496231 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="alertmanager"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496237 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="alertmanager"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496245 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-web"
Apr 21 15:39:26.496250 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496250 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-web"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496260 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerName="registry"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496265 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerName="registry"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496272 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496278 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496288 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-metric"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496292 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-metric"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496300 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="prom-label-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496305 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="prom-label-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496356 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496364 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="alertmanager"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496371 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="prom-label-proxy"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496397 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cbb7c2f-db4c-45f0-886d-922983c0ce02" containerName="registry"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496403 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-web"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496408 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="kube-rbac-proxy-metric"
Apr 21 15:39:26.496467 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.496414 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca089d8e-69ec-4c46-aded-046febde7982" containerName="config-reloader"
Apr 21 15:39:26.501365 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.501348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.504934 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.504914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 15:39:26.505043 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.505016 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 15:39:26.505043 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.505017 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 15:39:26.505751 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.505731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 15:39:26.505825 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.505806 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 15:39:26.506120 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.506103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 15:39:26.508155 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.508107 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 15:39:26.508254 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.508167 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 15:39:26.508254 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.508226 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-c5x4r\""
Apr 21 15:39:26.522972 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.512442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 15:39:26.529152 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.529112 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:39:26.592976 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.592926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgf7g\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-kube-api-access-cgf7g\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.592981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-out\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593109 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593089 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593110 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-web-config\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593219 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.593292 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.593232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-web-config\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgf7g\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-kube-api-access-cgf7g\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-out\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694587 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:39:26.694587 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.694587 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.694363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.695178 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.695116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.695178 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.695192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.695463 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.695441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697585 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-web-config\") pod \"alertmanager-main-0\" (UID: 
\"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697585 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697585 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697585 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697812 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.697916 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.697899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-config-out\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.699393 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.699377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.706805 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.706785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgf7g\" (UniqueName: \"kubernetes.io/projected/4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc-kube-api-access-cgf7g\") pod \"alertmanager-main-0\" (UID: \"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.813975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.813955 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:39:26.949856 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:26.949809 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:39:26.951957 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:39:26.951917 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd5aa1d_2fcd_4abc_a424_30e1fe11d4cc.slice/crio-5efb66ccc9b76ad8ee88f20eb78c79c53e2bf887db818e320330b5a047d36383 WatchSource:0}: Error finding container 5efb66ccc9b76ad8ee88f20eb78c79c53e2bf887db818e320330b5a047d36383: Status 404 returned error can't find the container with id 5efb66ccc9b76ad8ee88f20eb78c79c53e2bf887db818e320330b5a047d36383 Apr 21 15:39:27.433501 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:27.433472 2579 generic.go:358] "Generic (PLEG): container finished" podID="4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc" containerID="307943d75a92c2eb874b95e6eb04ad969fa87a02bb569a78ea4623f0fa4e5794" exitCode=0 Apr 21 15:39:27.433923 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:27.433565 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerDied","Data":"307943d75a92c2eb874b95e6eb04ad969fa87a02bb569a78ea4623f0fa4e5794"} Apr 21 15:39:27.433923 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:27.433600 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"5efb66ccc9b76ad8ee88f20eb78c79c53e2bf887db818e320330b5a047d36383"} Apr 21 15:39:27.608798 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:27.608720 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca089d8e-69ec-4c46-aded-046febde7982" 
path="/var/lib/kubelet/pods/ca089d8e-69ec-4c46-aded-046febde7982/volumes" Apr 21 15:39:28.440675 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440644 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"59d919fbf82f747dd7832da3267acb3fae95bab461052f96a76a6a2326658882"} Apr 21 15:39:28.440675 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"a3fc7d317b04cd2fb29347639744008ab11ba7fe1ad77654571eca84b8eae839"} Apr 21 15:39:28.440675 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"1ee4e3eb3d3e82ecc3ace0b70807f826b5cdb345f0841a7f457f2bff375f1f97"} Apr 21 15:39:28.441122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440701 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"c3f5c10c91aad1a8e24503d907d09b331007c04c43c2d96a66d95e7705a8d214"} Apr 21 15:39:28.441122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440712 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"0f4848be5b50d7f82720635981091f6f9ea0de64303a73fd67937666d30f0eff"} Apr 21 15:39:28.441122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.440721 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc","Type":"ContainerStarted","Data":"04a428ae8594a8573f46ad5e75f100f91af64afa123a7f4e73734fbf955988cf"} Apr 21 15:39:28.487497 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:28.487447 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.4874306170000002 podStartE2EDuration="2.487430617s" podCreationTimestamp="2026-04-21 15:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:39:28.479516373 +0000 UTC m=+233.507683167" watchObservedRunningTime="2026-04-21 15:39:28.487430617 +0000 UTC m=+233.515597414" Apr 21 15:39:29.048057 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.048018 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq"] Apr 21 15:39:29.051459 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.051439 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.058786 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.058766 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xp7nk\"" Apr 21 15:39:29.058786 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.058780 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 15:39:29.059007 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.058891 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 15:39:29.059007 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.058910 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 15:39:29.059007 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.058969 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 15:39:29.059263 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.059241 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 15:39:29.068973 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.068478 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 15:39:29.071472 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.071447 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq"] Apr 21 15:39:29.115842 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.115808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-metrics-client-ca\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.115981 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.115850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.115981 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.115876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-federate-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.115981 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.115971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.116086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.116001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-serving-certs-ca-bundle\") pod 
\"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.116086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.116037 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.116086 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.116056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5867\" (UniqueName: \"kubernetes.io/projected/92ad2768-ab1b-4047-9499-92f73e6e7306-kube-api-access-v5867\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.116168 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.116096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217220 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " 
pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217351 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-serving-certs-ca-bundle\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217351 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217351 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5867\" (UniqueName: \"kubernetes.io/projected/92ad2768-ab1b-4047-9499-92f73e6e7306-kube-api-access-v5867\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217486 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217522 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217503 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-metrics-client-ca\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217567 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.217617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.217587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-federate-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.218092 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.218063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-serving-certs-ca-bundle\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.218235 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.218215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-metrics-client-ca\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: 
\"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.218454 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.218306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.219747 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.219719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.220030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.220011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.220345 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.220315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-telemeter-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.220430 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.220350 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/92ad2768-ab1b-4047-9499-92f73e6e7306-federate-client-tls\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.229098 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.229069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5867\" (UniqueName: \"kubernetes.io/projected/92ad2768-ab1b-4047-9499-92f73e6e7306-kube-api-access-v5867\") pod \"telemeter-client-64cc55cd6f-7hkkq\" (UID: \"92ad2768-ab1b-4047-9499-92f73e6e7306\") " pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" Apr 21 15:39:29.355951 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.355913 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:39:29.356366 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356323 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="prometheus" containerID="cri-o://373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" gracePeriod=600 Apr 21 15:39:29.356498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356333 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy" containerID="cri-o://db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" gracePeriod=600 Apr 21 15:39:29.356498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356390 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-web" 
containerID="cri-o://3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" gracePeriod=600
Apr 21 15:39:29.356498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356437 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" gracePeriod=600
Apr 21 15:39:29.356498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356409 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="config-reloader" containerID="cri-o://de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" gracePeriod=600
Apr 21 15:39:29.356498 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.356352 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="thanos-sidecar" containerID="cri-o://85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" gracePeriod=600
Apr 21 15:39:29.361058 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.361033 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq"
Apr 21 15:39:29.499244 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.499220 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq"]
Apr 21 15:39:29.500485 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:39:29.500451 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ad2768_ab1b_4047_9499_92f73e6e7306.slice/crio-19d2cf8894ad060d0e0fda1106799aff507912a937f8a7b52ff1860bbfef85d1 WatchSource:0}: Error finding container 19d2cf8894ad060d0e0fda1106799aff507912a937f8a7b52ff1860bbfef85d1: Status 404 returned error can't find the container with id 19d2cf8894ad060d0e0fda1106799aff507912a937f8a7b52ff1860bbfef85d1
Apr 21 15:39:29.586691 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.586670 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:39:29.722631 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.722631 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722602 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.722631 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722622 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzfq\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.722856 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722651 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.722856 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722675 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.722856 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722796 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723031 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722861 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723031 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722913 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723031 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.722982 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723031 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723008 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723212 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723035 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723212 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723060 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723212 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723110 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723212 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723149 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723212 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723178 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723212 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723245 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723276 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle\") pod \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\" (UID: \"9d57ff44-23c3-46a1-aeb8-61fb85c14a53\") "
Apr 21 15:39:29.723987 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.723936 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:29.724152 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.724054 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:29.725903 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.724983 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:29.725903 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.725297 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:29.725903 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.725729 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:29.725903 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.725828 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.726216 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.725991 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.726424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.726383 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:39:29.726424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.726381 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq" (OuterVolumeSpecName: "kube-api-access-trzfq") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "kube-api-access-trzfq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:39:29.726809 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.726784 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.727258 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.727220 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:39:29.727258 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.727220 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config" (OuterVolumeSpecName: "config") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.727401 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.727378 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.727958 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.727918 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.728263 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.728236 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.728328 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.728290 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out" (OuterVolumeSpecName: "config-out") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:29.728664 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.728644 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.736676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.736656 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config" (OuterVolumeSpecName: "web-config") pod "9d57ff44-23c3-46a1-aeb8-61fb85c14a53" (UID: "9d57ff44-23c3-46a1-aeb8-61fb85c14a53"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:39:29.824187 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824163 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-tls-assets\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824187 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824187 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824202 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config-out\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824214 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824227 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824240 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824256 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-metrics-client-ca\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824269 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-metrics-client-certs\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824282 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-db\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824296 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824311 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-web-config\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824325 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824334 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824344 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trzfq\" (UniqueName: \"kubernetes.io/projected/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-kube-api-access-trzfq\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824702 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824358 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-grpc-tls\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824702 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824371 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824702 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824403 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824702 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824417 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-config\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:29.824702 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:29.824430 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9d57ff44-23c3-46a1-aeb8-61fb85c14a53-secret-kube-rbac-proxy\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:39:30.447908 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.447870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" event={"ID":"92ad2768-ab1b-4047-9499-92f73e6e7306","Type":"ContainerStarted","Data":"19d2cf8894ad060d0e0fda1106799aff507912a937f8a7b52ff1860bbfef85d1"}
Apr 21 15:39:30.451283 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451254 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" exitCode=0
Apr 21 15:39:30.451283 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451279 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" exitCode=0
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451289 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" exitCode=0
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451295 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" exitCode=0
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451300 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" exitCode=0
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451308 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" exitCode=0
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451360 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451376 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451386 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9d57ff44-23c3-46a1-aeb8-61fb85c14a53","Type":"ContainerDied","Data":"75076d41df3c11a6a98e2855daa28df13e165cc054b747d097694c8dff2f2323"}
Apr 21 15:39:30.451509 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.451432 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"
Apr 21 15:39:30.461148 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.461122 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"
Apr 21 15:39:30.469503 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.469485 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"
Apr 21 15:39:30.476612 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.476590 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"
Apr 21 15:39:30.480522 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.480495 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 15:39:30.486277 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.486254 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 15:39:30.486599 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.486582 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"
Apr 21 15:39:30.494083 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.494067 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"
Apr 21 15:39:30.500932 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.500918 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"
Apr 21 15:39:30.506960 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.506930 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"
Apr 21 15:39:30.507192 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.507175 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"
Apr 21 15:39:30.507238 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507200 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist"
Apr 21 15:39:30.507238 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507218 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"
Apr 21 15:39:30.507460 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.507441 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"
Apr 21 15:39:30.507505 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507467 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist"
Apr 21 15:39:30.507505 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507485 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"
Apr 21 15:39:30.507699 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.507681 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"
Apr 21 15:39:30.507770 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507708 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist"
Apr 21 15:39:30.507770 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.507731 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"
Apr 21 15:39:30.507997 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.507981 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"
Apr 21 15:39:30.508037 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508002 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist"
Apr 21 15:39:30.508037 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508015 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"
Apr 21 15:39:30.508230 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.508215 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"
Apr 21 15:39:30.508288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508239 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist"
Apr 21 15:39:30.508288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508259 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"
Apr 21 15:39:30.508540 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.508516 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"
Apr 21 15:39:30.508617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508545 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist"
Apr 21 15:39:30.508617 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508564 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"
Apr 21 15:39:30.508785 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:39:30.508769 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"
Apr 21 15:39:30.508830 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508789 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist"
Apr 21 15:39:30.508830 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.508811 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"
Apr 21 15:39:30.509084 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509057 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist"
Apr 21 15:39:30.509184 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509090 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"
Apr 21 15:39:30.509363 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509340 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist"
Apr 21 15:39:30.509443 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509364 2579 scope.go:117] "RemoveContainer"
containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" Apr 21 15:39:30.509681 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509655 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" Apr 21 15:39:30.509721 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509683 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" Apr 21 15:39:30.509918 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509901 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" Apr 21 15:39:30.509983 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.509919 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" Apr 21 15:39:30.510139 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510120 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status 
\"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" Apr 21 15:39:30.510183 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510142 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" Apr 21 15:39:30.510344 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510316 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" Apr 21 15:39:30.510381 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510346 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" Apr 21 15:39:30.510525 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510510 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" Apr 21 15:39:30.510569 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:39:30.510526 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" Apr 21 15:39:30.510735 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510719 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist" Apr 21 15:39:30.510774 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510735 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" Apr 21 15:39:30.510910 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510896 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist" Apr 21 15:39:30.510969 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.510909 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" Apr 21 15:39:30.511103 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511089 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" Apr 21 15:39:30.511153 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511104 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" Apr 21 15:39:30.511306 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511291 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" Apr 21 15:39:30.511306 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511305 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" Apr 21 15:39:30.511471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511455 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with 
de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" Apr 21 15:39:30.511507 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511471 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" Apr 21 15:39:30.511647 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511633 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" Apr 21 15:39:30.511693 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511647 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" Apr 21 15:39:30.511786 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511772 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" Apr 21 15:39:30.511829 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.511787 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" Apr 21 15:39:30.512025 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512006 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist" Apr 21 15:39:30.512025 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512025 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" Apr 21 15:39:30.512210 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512197 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist" Apr 21 15:39:30.512252 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512212 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" Apr 21 15:39:30.512413 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512366 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container 
\"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" Apr 21 15:39:30.512413 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512410 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" Apr 21 15:39:30.512579 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512560 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" Apr 21 15:39:30.512623 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512579 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" Apr 21 15:39:30.512776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512759 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" Apr 21 15:39:30.512828 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512777 2579 scope.go:117] "RemoveContainer" 
containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" Apr 21 15:39:30.512975 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512961 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" Apr 21 15:39:30.513014 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.512975 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" Apr 21 15:39:30.513153 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513134 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" Apr 21 15:39:30.513153 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513151 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" Apr 21 15:39:30.513377 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513355 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status 
\"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist" Apr 21 15:39:30.513424 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513379 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" Apr 21 15:39:30.513607 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513587 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist" Apr 21 15:39:30.513657 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513608 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" Apr 21 15:39:30.513762 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513748 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" Apr 21 15:39:30.513819 ip-10-0-132-141 
kubenswrapper[2579]: I0421 15:39:30.513762 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" Apr 21 15:39:30.513994 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513977 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" Apr 21 15:39:30.514041 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.513994 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" Apr 21 15:39:30.514160 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514147 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" Apr 21 15:39:30.514196 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514160 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" Apr 21 15:39:30.514358 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514337 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" Apr 21 15:39:30.514430 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514360 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" Apr 21 15:39:30.514568 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514552 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" Apr 21 15:39:30.514626 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514568 2579 scope.go:117] "RemoveContainer" containerID="d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a" Apr 21 15:39:30.514763 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514747 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a"} err="failed to get container status \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": rpc error: code = NotFound desc = could not find container \"d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a\": container with ID starting with 
d9e6198151d415034fe40d772eef1a864d0b5caa2531c0c3dd7b76966507069a not found: ID does not exist" Apr 21 15:39:30.514805 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514763 2579 scope.go:117] "RemoveContainer" containerID="db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc" Apr 21 15:39:30.514970 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514932 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc"} err="failed to get container status \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": rpc error: code = NotFound desc = could not find container \"db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc\": container with ID starting with db080278b80d9dad05515ef0a4b55978b89334561525018810676049d990efcc not found: ID does not exist" Apr 21 15:39:30.515009 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.514971 2579 scope.go:117] "RemoveContainer" containerID="3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91" Apr 21 15:39:30.515155 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515138 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91"} err="failed to get container status \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": rpc error: code = NotFound desc = could not find container \"3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91\": container with ID starting with 3bc074cb3402f07fbc0d8b7b6e8c9dce4e930f94cdacb63ee2b93aeacf593f91 not found: ID does not exist" Apr 21 15:39:30.515192 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515155 2579 scope.go:117] "RemoveContainer" containerID="85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411" Apr 21 15:39:30.515312 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515298 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411"} err="failed to get container status \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": rpc error: code = NotFound desc = could not find container \"85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411\": container with ID starting with 85f084e01df7e7293aa7d1674e3a5701cc2d5d7b64d7fe41f6b4fa9ee4483411 not found: ID does not exist" Apr 21 15:39:30.515347 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515314 2579 scope.go:117] "RemoveContainer" containerID="de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32" Apr 21 15:39:30.515500 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515477 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32"} err="failed to get container status \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": rpc error: code = NotFound desc = could not find container \"de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32\": container with ID starting with de6b9486b906352b76275467831c6b6172c9748e041cff9e3aab61021b98bb32 not found: ID does not exist" Apr 21 15:39:30.515500 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515499 2579 scope.go:117] "RemoveContainer" containerID="373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2" Apr 21 15:39:30.515708 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515692 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2"} err="failed to get container status \"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": rpc error: code = NotFound desc = could not find container 
\"373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2\": container with ID starting with 373f3ae990960c804054612fdad0944d12d43752ee8d0b08de5eaedf9418dcc2 not found: ID does not exist" Apr 21 15:39:30.515758 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515711 2579 scope.go:117] "RemoveContainer" containerID="070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6" Apr 21 15:39:30.515956 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.515922 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6"} err="failed to get container status \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": rpc error: code = NotFound desc = could not find container \"070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6\": container with ID starting with 070a484d5c09ad1f8b5a94c82b6fa8d166cc13b7d59c459f5d05a5a5f6cc83d6 not found: ID does not exist" Apr 21 15:39:30.527118 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527099 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:39:30.527388 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527377 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="thanos-sidecar" Apr 21 15:39:30.527429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527390 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="thanos-sidecar" Apr 21 15:39:30.527429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527398 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="config-reloader" Apr 21 15:39:30.527429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527403 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="config-reloader" Apr 21 15:39:30.527429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527419 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-web" Apr 21 15:39:30.527429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527424 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-web" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527432 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="prometheus" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527438 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="prometheus" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527443 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-thanos" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527448 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-thanos" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527457 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="init-config-reloader" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527461 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="init-config-reloader" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527466 2579 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527471 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527513 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="config-reloader" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527520 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-thanos" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527529 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="prometheus" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527535 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527542 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="kube-rbac-proxy-web" Apr 21 15:39:30.527560 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.527548 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" containerName="thanos-sidecar" Apr 21 15:39:30.532567 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.532551 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.538081 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.538062 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 15:39:30.538162 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.538067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 15:39:30.541887 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.541870 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-73cjef5q5dpua\"" Apr 21 15:39:30.542123 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.542110 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rfndn\"" Apr 21 15:39:30.544654 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.544638 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 15:39:30.544784 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.544745 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 15:39:30.544889 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.544873 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 15:39:30.544972 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.544879 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 15:39:30.545253 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.545239 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 15:39:30.545322 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.545308 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 15:39:30.545374 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.545321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 15:39:30.545374 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.545325 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 15:39:30.551414 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.551396 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 15:39:30.551785 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.551770 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 15:39:30.578850 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.578830 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:39:30.633060 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633034 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q244\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-kube-api-access-4q244\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633150 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633305 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633305 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633305 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633305 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633305 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633262 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-web-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config-out\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633475 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633442 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633649 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633486 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.633649 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.633517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734330 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734256 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734330 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734287 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734330 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q244\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-kube-api-access-4q244\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734550 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734550 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734550 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734778 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734778 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.734778 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.734662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735104 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735206 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-web-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735337 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config-out\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.735386 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.736105 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735405 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.736105 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.736105 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.735766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.739590 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.739562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.740040 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.740016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.740636 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.740452 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.740636 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.740476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.741143 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.741058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
15:39:30.741524 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.741498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.741604 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.741572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2492e9-fd61-4d7d-ab06-3411dd90d927-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.744771 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.742354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc2492e9-fd61-4d7d-ab06-3411dd90d927-config-out\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.744771 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.743549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.744771 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.744453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.744989 
ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.744866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.747174 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.745584 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.748295 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.748275 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.749268 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.749245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc2492e9-fd61-4d7d-ab06-3411dd90d927-web-config\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.749597 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.749572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q244\" (UniqueName: \"kubernetes.io/projected/fc2492e9-fd61-4d7d-ab06-3411dd90d927-kube-api-access-4q244\") pod \"prometheus-k8s-0\" (UID: \"fc2492e9-fd61-4d7d-ab06-3411dd90d927\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.842153 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.842114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:39:30.997582 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:30.997558 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:39:30.998159 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:39:30.998136 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2492e9_fd61_4d7d_ab06_3411dd90d927.slice/crio-f272ef624656cd6c5f72e37dc91a20da08cca7bb2abb10385709106c06e5f717 WatchSource:0}: Error finding container f272ef624656cd6c5f72e37dc91a20da08cca7bb2abb10385709106c06e5f717: Status 404 returned error can't find the container with id f272ef624656cd6c5f72e37dc91a20da08cca7bb2abb10385709106c06e5f717 Apr 21 15:39:31.459747 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:31.459710 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc2492e9-fd61-4d7d-ab06-3411dd90d927" containerID="8107dd431a7932bd26ee42fd0bac3f8344703565738f09559f4affe63346ccd6" exitCode=0 Apr 21 15:39:31.459906 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:31.459802 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerDied","Data":"8107dd431a7932bd26ee42fd0bac3f8344703565738f09559f4affe63346ccd6"} Apr 21 15:39:31.459906 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:31.459840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"f272ef624656cd6c5f72e37dc91a20da08cca7bb2abb10385709106c06e5f717"} Apr 21 15:39:31.609402 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:39:31.609373 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d57ff44-23c3-46a1-aeb8-61fb85c14a53" path="/var/lib/kubelet/pods/9d57ff44-23c3-46a1-aeb8-61fb85c14a53/volumes" Apr 21 15:39:32.465517 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465485 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"8b3fede3eabf4891836fa079846d4f54a2c5751d12898236dc90aa30a04951d0"} Apr 21 15:39:32.465676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465524 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"56933832a1d9d99ff0f8ce18e7039394d2d1e68e3be43aa5355369592756e800"} Apr 21 15:39:32.465676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"771ddab83765cc85f469a8269343d701b4eb8fba15e84fc29a2f93711fe6f58d"} Apr 21 15:39:32.465676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"3c098af9e2be76627fcbf257f24e519c0ae74f99cfdd26c51343535a6f39be66"} Apr 21 15:39:32.465676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"8cee61a3cca611052599a734c50c3031b12b0cc3b5a002fb4a380370b995710d"} Apr 21 15:39:32.465676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.465581 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fc2492e9-fd61-4d7d-ab06-3411dd90d927","Type":"ContainerStarted","Data":"56f464d38b19a7eb1986977dd67631833a7f688e96cb21d8ceff7c3cdcab7af0"} Apr 21 15:39:32.467211 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.467187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" event={"ID":"92ad2768-ab1b-4047-9499-92f73e6e7306","Type":"ContainerStarted","Data":"2d02955ff0223569414df0fd74f875aa432b454f339c07aec44c77413f7bbf04"} Apr 21 15:39:32.467211 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.467213 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" event={"ID":"92ad2768-ab1b-4047-9499-92f73e6e7306","Type":"ContainerStarted","Data":"0d517a880c71c0d739234e55c714bce93d1562bacf0b796de3be63c56455b84e"} Apr 21 15:39:32.467382 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.467223 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" event={"ID":"92ad2768-ab1b-4047-9499-92f73e6e7306","Type":"ContainerStarted","Data":"428b63822c4633f822e4dff6eec106176c831bad4957357f10c5dda2369a39cf"} Apr 21 15:39:32.504415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.504351 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.504324998 podStartE2EDuration="2.504324998s" podCreationTimestamp="2026-04-21 15:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:39:32.50167127 +0000 UTC m=+237.529838063" watchObservedRunningTime="2026-04-21 15:39:32.504324998 +0000 UTC m=+237.532491791" Apr 21 15:39:32.528600 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:32.528548 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-64cc55cd6f-7hkkq" podStartSLOduration=1.415289287 podStartE2EDuration="3.528533452s" podCreationTimestamp="2026-04-21 15:39:29 +0000 UTC" firstStartedPulling="2026-04-21 15:39:29.502213269 +0000 UTC m=+234.530380055" lastFinishedPulling="2026-04-21 15:39:31.615457444 +0000 UTC m=+236.643624220" observedRunningTime="2026-04-21 15:39:32.525684613 +0000 UTC m=+237.553851407" watchObservedRunningTime="2026-04-21 15:39:32.528533452 +0000 UTC m=+237.556700267" Apr 21 15:39:35.842537 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:39:35.842509 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:40:30.842910 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:30.842881 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:40:30.858993 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:30.858969 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:40:31.659538 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:31.659510 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:40:35.485578 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.485544 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:40:35.486309 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.486285 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:40:35.488162 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.488141 2579 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:40:35.488853 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.488820 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:40:35.493214 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.493190 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:40:35.493901 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:40:35.493878 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:41:22.832342 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.832245 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-94vp4"] Apr 21 15:41:22.835818 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.835795 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:22.838992 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.838972 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:41:22.863663 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.863636 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94vp4"] Apr 21 15:41:22.924288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.924247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-original-pull-secret\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:22.924469 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.924300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-kubelet-config\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:22.924469 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:22.924354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-dbus\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.025479 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.025427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-kubelet-config\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.025672 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.025495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-dbus\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.025672 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.025558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-original-pull-secret\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.025672 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.025571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-kubelet-config\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.025779 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.025714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-dbus\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.028129 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.028107 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd-original-pull-secret\") pod \"global-pull-secret-syncer-94vp4\" (UID: \"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd\") " pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.145077 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.144993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94vp4" Apr 21 15:41:23.273855 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.273819 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94vp4"] Apr 21 15:41:23.277109 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:41:23.277069 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf45fe3a8_4ce7_49d1_8a7a_ca68d2f07acd.slice/crio-d444e92d3fc2c9c6f03c6b36a1cd71585c402f0ddf83d6d3ebd7a4dea684f068 WatchSource:0}: Error finding container d444e92d3fc2c9c6f03c6b36a1cd71585c402f0ddf83d6d3ebd7a4dea684f068: Status 404 returned error can't find the container with id d444e92d3fc2c9c6f03c6b36a1cd71585c402f0ddf83d6d3ebd7a4dea684f068 Apr 21 15:41:23.278779 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.278762 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:41:23.798288 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:23.798251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94vp4" event={"ID":"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd","Type":"ContainerStarted","Data":"d444e92d3fc2c9c6f03c6b36a1cd71585c402f0ddf83d6d3ebd7a4dea684f068"} Apr 21 15:41:28.814813 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:28.814775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94vp4" 
event={"ID":"f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd","Type":"ContainerStarted","Data":"a2191a8e505e474964dc8aa46e8b080f1f24ce2bf97bf514c3cf6a4bc1ce9e5e"} Apr 21 15:41:28.833208 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:41:28.833156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-94vp4" podStartSLOduration=2.378786356 podStartE2EDuration="6.833142399s" podCreationTimestamp="2026-04-21 15:41:22 +0000 UTC" firstStartedPulling="2026-04-21 15:41:23.278891961 +0000 UTC m=+348.307058736" lastFinishedPulling="2026-04-21 15:41:27.733248003 +0000 UTC m=+352.761414779" observedRunningTime="2026-04-21 15:41:28.830665725 +0000 UTC m=+353.858832520" watchObservedRunningTime="2026-04-21 15:41:28.833142399 +0000 UTC m=+353.861309195" Apr 21 15:43:52.157717 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.157682 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tvm5d"] Apr 21 15:43:52.161122 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.161105 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.165107 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.165077 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 15:43:52.165228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.165142 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-7hbdx\"" Apr 21 15:43:52.165228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.165083 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 15:43:52.176074 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.176053 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tvm5d"] Apr 21 15:43:52.250370 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.250341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.250531 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.250396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nhh\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-kube-api-access-65nhh\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.351437 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.351402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-65nhh\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-kube-api-access-65nhh\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.351607 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.351500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.361982 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.361953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.362203 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.362182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nhh\" (UniqueName: \"kubernetes.io/projected/e716138b-5f19-4f99-af6e-cce457d69633-kube-api-access-65nhh\") pod \"cert-manager-webhook-587ccfb98-tvm5d\" (UID: \"e716138b-5f19-4f99-af6e-cce457d69633\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.475629 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.475542 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:52.621375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:52.621343 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tvm5d"] Apr 21 15:43:52.624121 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:43:52.624087 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode716138b_5f19_4f99_af6e_cce457d69633.slice/crio-e16a9146b1f94023179aba012a9952b46f9fd5757708f8dd8e34df79464c8641 WatchSource:0}: Error finding container e16a9146b1f94023179aba012a9952b46f9fd5757708f8dd8e34df79464c8641: Status 404 returned error can't find the container with id e16a9146b1f94023179aba012a9952b46f9fd5757708f8dd8e34df79464c8641 Apr 21 15:43:53.242769 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:53.242735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" event={"ID":"e716138b-5f19-4f99-af6e-cce457d69633","Type":"ContainerStarted","Data":"e16a9146b1f94023179aba012a9952b46f9fd5757708f8dd8e34df79464c8641"} Apr 21 15:43:55.641892 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.641859 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lxlv6"] Apr 21 15:43:55.645799 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.645779 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.649776 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.649753 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2g6d6\"" Apr 21 15:43:55.651348 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.651309 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lxlv6"] Apr 21 15:43:55.784241 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.784207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wds52\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-kube-api-access-wds52\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.784346 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.784286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.885302 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.885275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.885415 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.885366 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wds52\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-kube-api-access-wds52\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.897193 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.897136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.897429 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.897407 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wds52\" (UniqueName: \"kubernetes.io/projected/c8854e62-ad20-401b-aa3d-20869736ac27-kube-api-access-wds52\") pod \"cert-manager-cainjector-68b757865b-lxlv6\" (UID: \"c8854e62-ad20-401b-aa3d-20869736ac27\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:55.957361 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:55.957336 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" Apr 21 15:43:56.085712 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.085689 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lxlv6"] Apr 21 15:43:56.088316 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:43:56.088293 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8854e62_ad20_401b_aa3d_20869736ac27.slice/crio-8c62c3148b79e0adfc3a5f691e3e8583e9270e58db04985d2fec1710b594f22e WatchSource:0}: Error finding container 8c62c3148b79e0adfc3a5f691e3e8583e9270e58db04985d2fec1710b594f22e: Status 404 returned error can't find the container with id 8c62c3148b79e0adfc3a5f691e3e8583e9270e58db04985d2fec1710b594f22e Apr 21 15:43:56.254040 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.253920 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" event={"ID":"e716138b-5f19-4f99-af6e-cce457d69633","Type":"ContainerStarted","Data":"1b9da97557d2c069fef6806f8b4af5d73620ba85a87443c02350f33d1192ed03"} Apr 21 15:43:56.254210 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.254120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:43:56.255452 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.255428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" event={"ID":"c8854e62-ad20-401b-aa3d-20869736ac27","Type":"ContainerStarted","Data":"37907f4c6a9c41b7e92e825dbf0c280c2ae71df6029d2a2462d86098ac411347"} Apr 21 15:43:56.255452 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.255456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" 
event={"ID":"c8854e62-ad20-401b-aa3d-20869736ac27","Type":"ContainerStarted","Data":"8c62c3148b79e0adfc3a5f691e3e8583e9270e58db04985d2fec1710b594f22e"} Apr 21 15:43:56.292699 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.292647 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" podStartSLOduration=1.137119368 podStartE2EDuration="4.292631001s" podCreationTimestamp="2026-04-21 15:43:52 +0000 UTC" firstStartedPulling="2026-04-21 15:43:52.625888905 +0000 UTC m=+497.654055679" lastFinishedPulling="2026-04-21 15:43:55.781400535 +0000 UTC m=+500.809567312" observedRunningTime="2026-04-21 15:43:56.270851039 +0000 UTC m=+501.299017837" watchObservedRunningTime="2026-04-21 15:43:56.292631001 +0000 UTC m=+501.320797798" Apr 21 15:43:56.293484 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:43:56.293448 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-lxlv6" podStartSLOduration=1.29343609 podStartE2EDuration="1.29343609s" podCreationTimestamp="2026-04-21 15:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:43:56.292427208 +0000 UTC m=+501.320594000" watchObservedRunningTime="2026-04-21 15:43:56.29343609 +0000 UTC m=+501.321602885" Apr 21 15:44:02.259760 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:02.259732 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tvm5d" Apr 21 15:44:07.391768 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.391729 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd"] Apr 21 15:44:07.395427 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.395406 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.400407 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.400388 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 15:44:07.401356 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.401335 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:44:07.401471 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.401338 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-6hbfn\"" Apr 21 15:44:07.414766 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.414739 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd"] Apr 21 15:44:07.579003 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.578968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpr4f\" (UniqueName: \"kubernetes.io/projected/a5323695-5014-4f0d-923c-aca1fcce7dd1-kube-api-access-gpr4f\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.579163 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.579052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5323695-5014-4f0d-923c-aca1fcce7dd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.680539 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.680464 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gpr4f\" (UniqueName: \"kubernetes.io/projected/a5323695-5014-4f0d-923c-aca1fcce7dd1-kube-api-access-gpr4f\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.680539 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.680504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5323695-5014-4f0d-923c-aca1fcce7dd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.680798 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.680784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5323695-5014-4f0d-923c-aca1fcce7dd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.695195 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.695163 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpr4f\" (UniqueName: \"kubernetes.io/projected/a5323695-5014-4f0d-923c-aca1fcce7dd1-kube-api-access-gpr4f\") pod \"openshift-lws-operator-bfc7f696d-88zgd\" (UID: \"a5323695-5014-4f0d-923c-aca1fcce7dd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.704806 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.704787 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" Apr 21 15:44:07.843016 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:07.842986 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd"] Apr 21 15:44:07.846651 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:44:07.846623 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5323695_5014_4f0d_923c_aca1fcce7dd1.slice/crio-09ca47dfc26de2a75a98a324f8291fa9fad8b7ba0a26165724b6921da72c1d0e WatchSource:0}: Error finding container 09ca47dfc26de2a75a98a324f8291fa9fad8b7ba0a26165724b6921da72c1d0e: Status 404 returned error can't find the container with id 09ca47dfc26de2a75a98a324f8291fa9fad8b7ba0a26165724b6921da72c1d0e Apr 21 15:44:08.292268 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:08.292209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" event={"ID":"a5323695-5014-4f0d-923c-aca1fcce7dd1","Type":"ContainerStarted","Data":"09ca47dfc26de2a75a98a324f8291fa9fad8b7ba0a26165724b6921da72c1d0e"} Apr 21 15:44:10.302819 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:10.302726 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" event={"ID":"a5323695-5014-4f0d-923c-aca1fcce7dd1","Type":"ContainerStarted","Data":"56c3ad2b050032cb9f9766bcac3ccab8d8e9835299e69d71f4834a59767f0e5a"} Apr 21 15:44:10.326268 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:10.326199 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-88zgd" podStartSLOduration=1.162476587 podStartE2EDuration="3.326180349s" podCreationTimestamp="2026-04-21 15:44:07 +0000 UTC" firstStartedPulling="2026-04-21 15:44:07.848631464 +0000 UTC m=+512.876798242" 
lastFinishedPulling="2026-04-21 15:44:10.012335223 +0000 UTC m=+515.040502004" observedRunningTime="2026-04-21 15:44:10.322644944 +0000 UTC m=+515.350811740" watchObservedRunningTime="2026-04-21 15:44:10.326180349 +0000 UTC m=+515.354347146"
Apr 21 15:44:26.048421 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.048341 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-775wj"]
Apr 21 15:44:26.112196 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.112170 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-775wj"]
Apr 21 15:44:26.112348 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.112292 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.120688 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.120664 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 15:44:26.120989 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.120967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5d72j\""
Apr 21 15:44:26.121758 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.121744 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 15:44:26.121808 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.121745 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 15:44:26.213375 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.213349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0b0f5d96-5050-40ae-bf1c-5f10356839c7-manager-config\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.213507 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.213382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-metrics-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.213507 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.213415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhd9\" (UniqueName: \"kubernetes.io/projected/0b0f5d96-5050-40ae-bf1c-5f10356839c7-kube-api-access-6zhd9\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.213507 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.213485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.314045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.313980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-metrics-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.314045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.314028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhd9\" (UniqueName: \"kubernetes.io/projected/0b0f5d96-5050-40ae-bf1c-5f10356839c7-kube-api-access-6zhd9\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.314248 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.314060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.314248 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.314114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0b0f5d96-5050-40ae-bf1c-5f10356839c7-manager-config\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.314759 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.314738 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0b0f5d96-5050-40ae-bf1c-5f10356839c7-manager-config\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.316604 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.316584 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-metrics-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.316720 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.316703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b0f5d96-5050-40ae-bf1c-5f10356839c7-cert\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.340910 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.340884 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhd9\" (UniqueName: \"kubernetes.io/projected/0b0f5d96-5050-40ae-bf1c-5f10356839c7-kube-api-access-6zhd9\") pod \"lws-controller-manager-fb974466f-775wj\" (UID: \"0b0f5d96-5050-40ae-bf1c-5f10356839c7\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.422491 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.422464 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:26.574550 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:26.574524 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-775wj"]
Apr 21 15:44:26.577023 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:44:26.576994 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b0f5d96_5050_40ae_bf1c_5f10356839c7.slice/crio-a213d517abdd196bb6be6d1dbb276c4ed09504f8f7c39d4be090de0abc98f1cd WatchSource:0}: Error finding container a213d517abdd196bb6be6d1dbb276c4ed09504f8f7c39d4be090de0abc98f1cd: Status 404 returned error can't find the container with id a213d517abdd196bb6be6d1dbb276c4ed09504f8f7c39d4be090de0abc98f1cd
Apr 21 15:44:27.364056 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:27.364016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj" event={"ID":"0b0f5d96-5050-40ae-bf1c-5f10356839c7","Type":"ContainerStarted","Data":"a213d517abdd196bb6be6d1dbb276c4ed09504f8f7c39d4be090de0abc98f1cd"}
Apr 21 15:44:28.368951 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:28.368916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj" event={"ID":"0b0f5d96-5050-40ae-bf1c-5f10356839c7","Type":"ContainerStarted","Data":"5117f1f1190e346ea84ed4a62514e5de15c3eeabfebe6ea7cf387b6b34d4979f"}
Apr 21 15:44:28.369349 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:28.369013 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:44:28.410108 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:28.410053 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj" podStartSLOduration=0.997246497 podStartE2EDuration="2.41003839s" podCreationTimestamp="2026-04-21 15:44:26 +0000 UTC" firstStartedPulling="2026-04-21 15:44:26.578802087 +0000 UTC m=+531.606968861" lastFinishedPulling="2026-04-21 15:44:27.99159397 +0000 UTC m=+533.019760754" observedRunningTime="2026-04-21 15:44:28.409328914 +0000 UTC m=+533.437495709" watchObservedRunningTime="2026-04-21 15:44:28.41003839 +0000 UTC m=+533.438205187"
Apr 21 15:44:39.374557 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:44:39.374529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fb974466f-775wj"
Apr 21 15:45:12.890422 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.890383 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"]
Apr 21 15:45:12.894750 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.894733 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:12.898982 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.898961 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 15:45:12.899119 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.899096 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 21 15:45:12.899385 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.899369 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 15:45:12.900064 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.900049 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-mw7dj\""
Apr 21 15:45:12.913760 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.913735 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"]
Apr 21 15:45:12.917468 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:12.917450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpmm\" (UniqueName: \"kubernetes.io/projected/d0892fa6-d3bc-4b17-a57e-bade8ade4cf1-kube-api-access-4hpmm\") pod \"dns-operator-controller-manager-844548ff4c-92f7f\" (UID: \"d0892fa6-d3bc-4b17-a57e-bade8ade4cf1\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:13.018417 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:13.018394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpmm\" (UniqueName: \"kubernetes.io/projected/d0892fa6-d3bc-4b17-a57e-bade8ade4cf1-kube-api-access-4hpmm\") pod \"dns-operator-controller-manager-844548ff4c-92f7f\" (UID: \"d0892fa6-d3bc-4b17-a57e-bade8ade4cf1\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:13.031267 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:13.031241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpmm\" (UniqueName: \"kubernetes.io/projected/d0892fa6-d3bc-4b17-a57e-bade8ade4cf1-kube-api-access-4hpmm\") pod \"dns-operator-controller-manager-844548ff4c-92f7f\" (UID: \"d0892fa6-d3bc-4b17-a57e-bade8ade4cf1\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:13.205662 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:13.205611 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:13.335764 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:13.335739 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"]
Apr 21 15:45:13.338193 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:45:13.338164 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0892fa6_d3bc_4b17_a57e_bade8ade4cf1.slice/crio-438fb07fe7e7db21ca796b4442f2c7967e7ce9613c8082e61d2353d1ccf64c89 WatchSource:0}: Error finding container 438fb07fe7e7db21ca796b4442f2c7967e7ce9613c8082e61d2353d1ccf64c89: Status 404 returned error can't find the container with id 438fb07fe7e7db21ca796b4442f2c7967e7ce9613c8082e61d2353d1ccf64c89
Apr 21 15:45:13.531364 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:13.531298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f" event={"ID":"d0892fa6-d3bc-4b17-a57e-bade8ade4cf1","Type":"ContainerStarted","Data":"438fb07fe7e7db21ca796b4442f2c7967e7ce9613c8082e61d2353d1ccf64c89"}
Apr 21 15:45:16.545162 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:16.545124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f" event={"ID":"d0892fa6-d3bc-4b17-a57e-bade8ade4cf1","Type":"ContainerStarted","Data":"0e68af673ea0e591bce2f5a811316241713f4276a7f086885a965e8a89af4d81"}
Apr 21 15:45:16.545615 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:16.545228 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:16.567200 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:16.567103 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f" podStartSLOduration=2.145251534 podStartE2EDuration="4.567087593s" podCreationTimestamp="2026-04-21 15:45:12 +0000 UTC" firstStartedPulling="2026-04-21 15:45:13.340072294 +0000 UTC m=+578.368239068" lastFinishedPulling="2026-04-21 15:45:15.761908341 +0000 UTC m=+580.790075127" observedRunningTime="2026-04-21 15:45:16.566467219 +0000 UTC m=+581.594634015" watchObservedRunningTime="2026-04-21 15:45:16.567087593 +0000 UTC m=+581.595254390"
Apr 21 15:45:21.499202 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.499167 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"]
Apr 21 15:45:21.502565 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.502548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.505615 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.505596 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-2qtww\""
Apr 21 15:45:21.544684 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.544662 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"]
Apr 21 15:45:21.589209 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.589186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtc7\" (UniqueName: \"kubernetes.io/projected/6eedf922-1485-472a-b11d-60d6d59b53e3-kube-api-access-pdtc7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.589321 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.589243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6eedf922-1485-472a-b11d-60d6d59b53e3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.689919 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.689893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6eedf922-1485-472a-b11d-60d6d59b53e3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.690054 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.690007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtc7\" (UniqueName: \"kubernetes.io/projected/6eedf922-1485-472a-b11d-60d6d59b53e3-kube-api-access-pdtc7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.690316 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.690293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6eedf922-1485-472a-b11d-60d6d59b53e3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.699619 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.699600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtc7\" (UniqueName: \"kubernetes.io/projected/6eedf922-1485-472a-b11d-60d6d59b53e3-kube-api-access-pdtc7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9sfqk\" (UID: \"6eedf922-1485-472a-b11d-60d6d59b53e3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.812343 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.812265 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:21.947022 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:21.946840 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"]
Apr 21 15:45:21.950153 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:45:21.950129 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eedf922_1485_472a_b11d_60d6d59b53e3.slice/crio-8fcefbdbf486db6ccb31a89251e3dd3ada0b9c2569b840e58267afa97aa1f1e7 WatchSource:0}: Error finding container 8fcefbdbf486db6ccb31a89251e3dd3ada0b9c2569b840e58267afa97aa1f1e7: Status 404 returned error can't find the container with id 8fcefbdbf486db6ccb31a89251e3dd3ada0b9c2569b840e58267afa97aa1f1e7
Apr 21 15:45:22.566387 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:22.566351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk" event={"ID":"6eedf922-1485-472a-b11d-60d6d59b53e3","Type":"ContainerStarted","Data":"8fcefbdbf486db6ccb31a89251e3dd3ada0b9c2569b840e58267afa97aa1f1e7"}
Apr 21 15:45:26.583199 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:26.583170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk" event={"ID":"6eedf922-1485-472a-b11d-60d6d59b53e3","Type":"ContainerStarted","Data":"a1def60c14eca19afa92ac7290223baaa5d8bbd3d7f965e64abd55bcd36fa5ee"}
Apr 21 15:45:26.583539 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:26.583301 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:45:26.607706 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:26.607658 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk" podStartSLOduration=1.121376221 podStartE2EDuration="5.607643757s" podCreationTimestamp="2026-04-21 15:45:21 +0000 UTC" firstStartedPulling="2026-04-21 15:45:21.952409198 +0000 UTC m=+586.980575972" lastFinishedPulling="2026-04-21 15:45:26.438676731 +0000 UTC m=+591.466843508" observedRunningTime="2026-04-21 15:45:26.60364436 +0000 UTC m=+591.631811156" watchObservedRunningTime="2026-04-21 15:45:26.607643757 +0000 UTC m=+591.635810553"
Apr 21 15:45:27.551045 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:27.551011 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-92f7f"
Apr 21 15:45:35.511871 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.511837 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log"
Apr 21 15:45:35.512805 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.512784 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log"
Apr 21 15:45:35.514207 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.514181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log"
Apr 21 15:45:35.515068 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.515049 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log"
Apr 21 15:45:35.518930 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.518913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log"
Apr 21 15:45:35.519699 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:35.519677 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log"
Apr 21 15:45:37.589872 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:45:37.589837 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9sfqk"
Apr 21 15:46:11.487846 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.487769 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:11.496641 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.496608 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:11.499137 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.499112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-hs69b\""
Apr 21 15:46:11.501156 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.501128 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:11.517959 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.517911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l2js\" (UniqueName: \"kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js\") pod \"authorino-674b59b84c-xrrlz\" (UID: \"c20e6570-78f4-4318-bbb4-b22d01f2b864\") " pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:11.619283 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.619244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l2js\" (UniqueName: \"kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js\") pod \"authorino-674b59b84c-xrrlz\" (UID: \"c20e6570-78f4-4318-bbb4-b22d01f2b864\") " pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:11.627901 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.627864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l2js\" (UniqueName: \"kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js\") pod \"authorino-674b59b84c-xrrlz\" (UID: \"c20e6570-78f4-4318-bbb4-b22d01f2b864\") " pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:11.807654 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.807580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:11.930963 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:11.930906 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:11.934251 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:46:11.934225 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20e6570_78f4_4318_bbb4_b22d01f2b864.slice/crio-3ba738ae55c2d7e89ef9eaae315221071b1813951381cca0f99a6221fa283ce4 WatchSource:0}: Error finding container 3ba738ae55c2d7e89ef9eaae315221071b1813951381cca0f99a6221fa283ce4: Status 404 returned error can't find the container with id 3ba738ae55c2d7e89ef9eaae315221071b1813951381cca0f99a6221fa283ce4
Apr 21 15:46:12.740257 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:12.740170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-xrrlz" event={"ID":"c20e6570-78f4-4318-bbb4-b22d01f2b864","Type":"ContainerStarted","Data":"3ba738ae55c2d7e89ef9eaae315221071b1813951381cca0f99a6221fa283ce4"}
Apr 21 15:46:14.748444 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:14.748403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-xrrlz" event={"ID":"c20e6570-78f4-4318-bbb4-b22d01f2b864","Type":"ContainerStarted","Data":"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"}
Apr 21 15:46:14.765600 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:14.765556 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-xrrlz" podStartSLOduration=1.780380896 podStartE2EDuration="3.765540866s" podCreationTimestamp="2026-04-21 15:46:11 +0000 UTC" firstStartedPulling="2026-04-21 15:46:11.935386613 +0000 UTC m=+636.963553388" lastFinishedPulling="2026-04-21 15:46:13.920546577 +0000 UTC m=+638.948713358" observedRunningTime="2026-04-21 15:46:14.763185568 +0000 UTC m=+639.791352364" watchObservedRunningTime="2026-04-21 15:46:14.765540866 +0000 UTC m=+639.793707853"
Apr 21 15:46:16.339598 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:16.339562 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:16.755842 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:16.755740 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-xrrlz" podUID="c20e6570-78f4-4318-bbb4-b22d01f2b864" containerName="authorino" containerID="cri-o://1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55" gracePeriod=30
Apr 21 15:46:16.997954 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:16.997914 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:17.068321 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.068257 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l2js\" (UniqueName: \"kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js\") pod \"c20e6570-78f4-4318-bbb4-b22d01f2b864\" (UID: \"c20e6570-78f4-4318-bbb4-b22d01f2b864\") "
Apr 21 15:46:17.070582 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.070555 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js" (OuterVolumeSpecName: "kube-api-access-2l2js") pod "c20e6570-78f4-4318-bbb4-b22d01f2b864" (UID: "c20e6570-78f4-4318-bbb4-b22d01f2b864"). InnerVolumeSpecName "kube-api-access-2l2js". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:46:17.169175 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.169148 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2l2js\" (UniqueName: \"kubernetes.io/projected/c20e6570-78f4-4318-bbb4-b22d01f2b864-kube-api-access-2l2js\") on node \"ip-10-0-132-141.ec2.internal\" DevicePath \"\""
Apr 21 15:46:17.760536 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.760501 2579 generic.go:358] "Generic (PLEG): container finished" podID="c20e6570-78f4-4318-bbb4-b22d01f2b864" containerID="1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55" exitCode=0
Apr 21 15:46:17.761030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.760577 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-xrrlz"
Apr 21 15:46:17.761030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.760578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-xrrlz" event={"ID":"c20e6570-78f4-4318-bbb4-b22d01f2b864","Type":"ContainerDied","Data":"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"}
Apr 21 15:46:17.761030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.760689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-xrrlz" event={"ID":"c20e6570-78f4-4318-bbb4-b22d01f2b864","Type":"ContainerDied","Data":"3ba738ae55c2d7e89ef9eaae315221071b1813951381cca0f99a6221fa283ce4"}
Apr 21 15:46:17.761030 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.760709 2579 scope.go:117] "RemoveContainer" containerID="1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"
Apr 21 15:46:17.768701 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.768683 2579 scope.go:117] "RemoveContainer" containerID="1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"
Apr 21 15:46:17.768990 ip-10-0-132-141 kubenswrapper[2579]: E0421 15:46:17.768966 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55\": container with ID starting with 1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55 not found: ID does not exist" containerID="1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"
Apr 21 15:46:17.769092 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.768996 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55"} err="failed to get container status \"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55\": rpc error: code = NotFound desc = could not find container \"1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55\": container with ID starting with 1d087a42ef9c60697d675b69f352499833691cb66474b7f25f6719fef3edea55 not found: ID does not exist"
Apr 21 15:46:17.780405 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.780381 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:17.783087 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:17.783066 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-xrrlz"]
Apr 21 15:46:19.608707 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:19.608672 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20e6570-78f4-4318-bbb4-b22d01f2b864" path="/var/lib/kubelet/pods/c20e6570-78f4-4318-bbb4-b22d01f2b864/volumes"
Apr 21 15:46:48.505312 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.505279 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"]
Apr 21 15:46:48.505766 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.505631 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c20e6570-78f4-4318-bbb4-b22d01f2b864" containerName="authorino"
Apr 21 15:46:48.505766 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.505641 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20e6570-78f4-4318-bbb4-b22d01f2b864" containerName="authorino"
Apr 21 15:46:48.505766 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.505702 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c20e6570-78f4-4318-bbb4-b22d01f2b864" containerName="authorino"
Apr 21 15:46:48.509234 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.509220 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.512227 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.512205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-hvdmm\""
Apr 21 15:46:48.512491 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.512473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 15:46:48.512747 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.512725 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 15:46:48.512827 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.512762 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 15:46:48.513388 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.513367 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 15:46:48.513503 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.513372 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 15:46:48.513503 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.513423 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 15:46:48.519620 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.519599 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"]
Apr 21 15:46:48.631416 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631416 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631637 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631637 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631637 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631637 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631606 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cw9\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-kube-api-access-r6cw9\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.631762 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.631675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/11240933-7d39-4fee-919b-dbec639a0b1a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.732307 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.732481 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cw9\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-kube-api-access-r6cw9\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"
Apr 21 15:46:48.732481 ip-10-0-132-141
kubenswrapper[2579]: I0421 15:46:48.732387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/11240933-7d39-4fee-919b-dbec639a0b1a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.732481 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732426 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.732481 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.732676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.732676 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.732523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.733132 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.733105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.734987 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.734967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/11240933-7d39-4fee-919b-dbec639a0b1a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.735081 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.734991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.735186 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.735165 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.735223 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.735209 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/11240933-7d39-4fee-919b-dbec639a0b1a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.740603 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.740576 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.741194 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.741178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cw9\" (UniqueName: \"kubernetes.io/projected/11240933-7d39-4fee-919b-dbec639a0b1a-kube-api-access-r6cw9\") pod \"istiod-openshift-gateway-55ff986f96-j5dhl\" (UID: \"11240933-7d39-4fee-919b-dbec639a0b1a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.820214 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.820185 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:48.985485 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.985449 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl"] Apr 21 15:46:48.988375 ip-10-0-132-141 kubenswrapper[2579]: W0421 15:46:48.988339 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11240933_7d39_4fee_919b_dbec639a0b1a.slice/crio-bdcf1f9abd1dff3cdf4187e30327351008ef44ad4cf1150a3119c474cb643a11 WatchSource:0}: Error finding container bdcf1f9abd1dff3cdf4187e30327351008ef44ad4cf1150a3119c474cb643a11: Status 404 returned error can't find the container with id bdcf1f9abd1dff3cdf4187e30327351008ef44ad4cf1150a3119c474cb643a11 Apr 21 15:46:48.990232 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:48.990213 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:46:49.868453 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:49.868417 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" event={"ID":"11240933-7d39-4fee-919b-dbec639a0b1a","Type":"ContainerStarted","Data":"bdcf1f9abd1dff3cdf4187e30327351008ef44ad4cf1150a3119c474cb643a11"} Apr 21 15:46:51.916797 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:51.916751 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:46:51.917146 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:51.916829 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:46:52.883228 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:52.883181 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" event={"ID":"11240933-7d39-4fee-919b-dbec639a0b1a","Type":"ContainerStarted","Data":"b2b6219967a485138d52b3980dfac462ac88a3cf5e2271e366eee7220bd7ab9d"} Apr 21 15:46:52.884304 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:52.884280 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:52.885929 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:52.885905 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" Apr 21 15:46:52.919002 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:46:52.918930 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j5dhl" podStartSLOduration=1.992830353 podStartE2EDuration="4.91891228s" podCreationTimestamp="2026-04-21 15:46:48 +0000 UTC" firstStartedPulling="2026-04-21 15:46:48.990403714 +0000 UTC m=+674.018570494" lastFinishedPulling="2026-04-21 15:46:51.916485646 +0000 UTC m=+676.944652421" observedRunningTime="2026-04-21 15:46:52.915201535 +0000 UTC m=+677.943368332" watchObservedRunningTime="2026-04-21 15:46:52.91891228 +0000 UTC m=+677.947079077" Apr 21 15:50:35.538042 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:50:35.537954 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:50:35.540668 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:50:35.540649 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:50:35.540957 ip-10-0-132-141 kubenswrapper[2579]: I0421 
15:50:35.540913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:50:35.543489 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:50:35.543473 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:50:35.545701 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:50:35.545681 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:50:35.548651 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:50:35.548617 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:55:35.565072 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.565040 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:55:35.567215 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.567193 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:55:35.569088 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.569071 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 15:55:35.571306 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.571287 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 15:55:35.571468 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.571453 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 15:55:35.575768 ip-10-0-132-141 kubenswrapper[2579]: I0421 15:55:35.575754 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 16:00:35.590445 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.590419 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 16:00:35.592874 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.592852 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 16:00:35.596067 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.596046 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 16:00:35.597575 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.597558 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 16:00:35.598331 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.598311 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 
16:00:35.602968 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:00:35.602934 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 16:05:35.618565 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.618537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 16:05:35.620885 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.620860 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 16:05:35.624357 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.624333 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 16:05:35.625799 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.625768 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 16:05:35.627043 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.627008 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log" Apr 21 16:05:35.634540 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:35.634522 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log" Apr 21 16:05:58.380576 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:58.380541 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-j5dhl_11240933-7d39-4fee-919b-dbec639a0b1a/discovery/0.log" Apr 21 16:05:59.258960 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:05:59.258915 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-j5dhl_11240933-7d39-4fee-919b-dbec639a0b1a/discovery/0.log" Apr 21 16:06:00.133639 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:00.133606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-92f7f_d0892fa6-d3bc-4b17-a57e-bade8ade4cf1/manager/0.log" Apr 21 16:06:00.203325 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:00.203292 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-9sfqk_6eedf922-1485-472a-b11d-60d6d59b53e3/manager/0.log" Apr 21 16:06:05.696121 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:05.696081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-94vp4_f45fe3a8-4ce7-49d1-8a7a-ca68d2f07acd/global-pull-secret-syncer/0.log" Apr 21 16:06:05.773176 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:05.773147 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-crzt4_a7482d21-f4ae-4fe1-a26f-d1de8cd73926/konnectivity-agent/0.log" Apr 21 16:06:05.849229 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:05.849198 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-141.ec2.internal_546d5bb4e1c933cd7732e2f27360ece8/haproxy/0.log" Apr 21 16:06:09.745747 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:09.745715 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-92f7f_d0892fa6-d3bc-4b17-a57e-bade8ade4cf1/manager/0.log" Apr 21 16:06:09.850856 ip-10-0-132-141 
kubenswrapper[2579]: I0421 16:06:09.850826 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-9sfqk_6eedf922-1485-472a-b11d-60d6d59b53e3/manager/0.log" Apr 21 16:06:10.901227 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:10.901200 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/alertmanager/0.log" Apr 21 16:06:10.923143 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:10.923112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/config-reloader/0.log" Apr 21 16:06:10.945103 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:10.945076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/kube-rbac-proxy-web/0.log" Apr 21 16:06:10.968882 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:10.968856 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/kube-rbac-proxy/0.log" Apr 21 16:06:10.995194 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:10.995161 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/kube-rbac-proxy-metric/0.log" Apr 21 16:06:11.024860 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.024828 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/prom-label-proxy/0.log" Apr 21 16:06:11.047492 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.047466 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dd5aa1d-2fcd-4abc-a424-30e1fe11d4cc/init-config-reloader/0.log" Apr 21 16:06:11.089769 
ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.089730 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/1.log" Apr 21 16:06:11.257741 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.257661 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zf8zt_a1f66133-b065-4100-b368-ac1f349bf896/cluster-monitoring-operator/0.log" Apr 21 16:06:11.286080 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.286042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-state-metrics/0.log" Apr 21 16:06:11.305446 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.305419 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-rbac-proxy-main/0.log" Apr 21 16:06:11.328845 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.328819 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6xnbr_3cb44a5b-124e-4afc-a061-3f8c0e6474db/kube-rbac-proxy-self/0.log" Apr 21 16:06:11.353444 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.353416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-945fb46b9-n7wkp_c7328b78-bba6-45e4-a754-8ac5f6456e78/metrics-server/0.log" Apr 21 16:06:11.378497 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.378471 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jj9dg_16063ffa-ebfd-4360-88c3-de39e192699f/monitoring-plugin/0.log" Apr 21 16:06:11.409753 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.409715 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/node-exporter/0.log" Apr 21 16:06:11.438083 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.438055 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/kube-rbac-proxy/0.log" Apr 21 16:06:11.460524 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.460500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65cbt_928583e4-43ad-4abd-acc5-eb09b449e3b7/init-textfile/0.log" Apr 21 16:06:11.773635 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.773606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/prometheus/0.log" Apr 21 16:06:11.794608 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.794582 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/config-reloader/0.log" Apr 21 16:06:11.821112 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.821086 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/thanos-sidecar/0.log" Apr 21 16:06:11.849168 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.849137 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/kube-rbac-proxy-web/0.log" Apr 21 16:06:11.880673 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.880642 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/kube-rbac-proxy/0.log" Apr 21 16:06:11.903814 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.903789 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/kube-rbac-proxy-thanos/0.log" Apr 21 16:06:11.925975 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:11.925925 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fc2492e9-fd61-4d7d-ab06-3411dd90d927/init-config-reloader/0.log" Apr 21 16:06:12.039269 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.039183 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64cc55cd6f-7hkkq_92ad2768-ab1b-4047-9499-92f73e6e7306/telemeter-client/0.log" Apr 21 16:06:12.059034 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.059002 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64cc55cd6f-7hkkq_92ad2768-ab1b-4047-9499-92f73e6e7306/reload/0.log" Apr 21 16:06:12.079603 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.079564 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64cc55cd6f-7hkkq_92ad2768-ab1b-4047-9499-92f73e6e7306/kube-rbac-proxy/0.log" Apr 21 16:06:12.129467 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.129436 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/thanos-query/0.log" Apr 21 16:06:12.151735 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.151714 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-web/0.log" Apr 21 16:06:12.175740 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.175716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy/0.log" Apr 21 16:06:12.196246 ip-10-0-132-141 kubenswrapper[2579]: I0421 
16:06:12.196224 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/prom-label-proxy/0.log"
Apr 21 16:06:12.221701 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.221675 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-rules/0.log"
Apr 21 16:06:12.243746 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:12.243722 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dddb456dc-cz6f5_de1019eb-eae5-4deb-bb4c-327d43db9d13/kube-rbac-proxy-metrics/0.log"
Apr 21 16:06:13.885131 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:13.885102 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/2.log"
Apr 21 16:06:13.893344 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:13.893319 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kwhwh_60c10748-f987-4f76-8f57-6a42bf9f4321/console-operator/3.log"
Apr 21 16:06:14.552677 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.552641 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"]
Apr 21 16:06:14.556263 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.556241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.560133 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.560112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dt7xd\"/\"default-dockercfg-k8dzq\""
Apr 21 16:06:14.560254 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.560237 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"kube-root-ca.crt\""
Apr 21 16:06:14.561121 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.561106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"openshift-service-ca.crt\""
Apr 21 16:06:14.571066 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.571045 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"]
Apr 21 16:06:14.616979 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.616929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-lib-modules\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.616979 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.616977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8pg9\" (UniqueName: \"kubernetes.io/projected/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-kube-api-access-m8pg9\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.617216 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.617118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-podres\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.617216 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.617186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-sys\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.617305 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.617253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-proc\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717725 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-podres\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-sys\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-proc\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-lib-modules\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-podres\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8pg9\" (UniqueName: \"kubernetes.io/projected/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-kube-api-access-m8pg9\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-sys\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.717932 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717920 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-proc\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.718341 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.717974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-lib-modules\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.726092 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.726073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8pg9\" (UniqueName: \"kubernetes.io/projected/7ba97e8b-fd5d-460d-ae88-a29c6c6b2511-kube-api-access-m8pg9\") pod \"perf-node-gather-daemonset-f5lx5\" (UID: \"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.823983 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.823952 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-npp9r_0bb5aac3-8258-4d28-bb86-c01cb26966b1/volume-data-source-validator/0.log"
Apr 21 16:06:14.866563 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.866533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:14.995914 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:14.995879 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"]
Apr 21 16:06:15.000062 ip-10-0-132-141 kubenswrapper[2579]: W0421 16:06:15.000032 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7ba97e8b_fd5d_460d_ae88_a29c6c6b2511.slice/crio-16c5405c40490b90244e41d0c020f353fda5d9c4a0f5ee391c5b0e32e7ffb596 WatchSource:0}: Error finding container 16c5405c40490b90244e41d0c020f353fda5d9c4a0f5ee391c5b0e32e7ffb596: Status 404 returned error can't find the container with id 16c5405c40490b90244e41d0c020f353fda5d9c4a0f5ee391c5b0e32e7ffb596
Apr 21 16:06:15.001668 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.001649 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 16:06:15.528134 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.528036 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dgpm6_704d38f9-6323-48bf-b8f7-977c83275b82/dns/0.log"
Apr 21 16:06:15.550903 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.550870 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dgpm6_704d38f9-6323-48bf-b8f7-977c83275b82/kube-rbac-proxy/0.log"
Apr 21 16:06:15.671245 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.671211 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2xzlk_a396e4e6-5a05-450a-8a8c-263dd9674c34/dns-node-resolver/0.log"
Apr 21 16:06:15.912398 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.912362 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5" event={"ID":"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511","Type":"ContainerStarted","Data":"f50bdb6ba4b3448793ca496a06dd169f9784c3d3912d2d4087622053e3d17640"}
Apr 21 16:06:15.912398 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.912403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5" event={"ID":"7ba97e8b-fd5d-460d-ae88-a29c6c6b2511","Type":"ContainerStarted","Data":"16c5405c40490b90244e41d0c020f353fda5d9c4a0f5ee391c5b0e32e7ffb596"}
Apr 21 16:06:15.912679 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.912440 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:15.929746 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:15.929698 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5" podStartSLOduration=1.929679406 podStartE2EDuration="1.929679406s" podCreationTimestamp="2026-04-21 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:06:15.928587975 +0000 UTC m=+1840.956754772" watchObservedRunningTime="2026-04-21 16:06:15.929679406 +0000 UTC m=+1840.957846251"
Apr 21 16:06:16.202317 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:16.202230 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fj5g2_657382f2-3c88-4d85-b5cf-5533d6e4b19e/node-ca/0.log"
Apr 21 16:06:17.135421 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:17.135383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-j5dhl_11240933-7d39-4fee-919b-dbec639a0b1a/discovery/0.log"
Apr 21 16:06:17.671352 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:17.671308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-klb4g_dff5e891-a3c3-4526-94e0-f1c91d517e9d/serve-healthcheck-canary/0.log"
Apr 21 16:06:18.170262 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:18.170211 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48m9g_ff5f3b20-ad56-4139-ac9f-e02877e78f48/kube-rbac-proxy/0.log"
Apr 21 16:06:18.192234 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:18.192204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48m9g_ff5f3b20-ad56-4139-ac9f-e02877e78f48/exporter/0.log"
Apr 21 16:06:18.215285 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:18.215141 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48m9g_ff5f3b20-ad56-4139-ac9f-e02877e78f48/extractor/0.log"
Apr 21 16:06:20.844031 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:20.844000 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fb974466f-775wj_0b0f5d96-5050-40ae-bf1c-5f10356839c7/manager/0.log"
Apr 21 16:06:20.864990 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:20.864958 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-88zgd_a5323695-5014-4f0d-923c-aca1fcce7dd1/openshift-lws-operator/0.log"
Apr 21 16:06:21.925832 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:21.925799 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-f5lx5"
Apr 21 16:06:26.266785 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:26.266753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l82qb_d9332cef-c45c-4842-b3bb-c9aa72e9fbf5/migrator/0.log"
Apr 21 16:06:26.285169 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:26.285140 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l82qb_d9332cef-c45c-4842-b3bb-c9aa72e9fbf5/graceful-termination/0.log"
Apr 21 16:06:27.746625 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.746592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/kube-multus-additional-cni-plugins/0.log"
Apr 21 16:06:27.768368 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.768338 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/egress-router-binary-copy/0.log"
Apr 21 16:06:27.790114 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.790088 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/cni-plugins/0.log"
Apr 21 16:06:27.813900 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.813874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/bond-cni-plugin/0.log"
Apr 21 16:06:27.834741 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.834714 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/routeoverride-cni/0.log"
Apr 21 16:06:27.855128 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.855110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/whereabouts-cni-bincopy/0.log"
Apr 21 16:06:27.875018 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:27.874996 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz26j_b151d377-fb3e-44d5-a5e1-57bb572347d7/whereabouts-cni/0.log"
Apr 21 16:06:28.165043 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:28.165010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tcxzq_49b893d4-ef45-4e0b-9df7-4cda6b46fd3d/kube-multus/0.log"
Apr 21 16:06:28.286371 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:28.286316 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x5zkt_088bc7e8-4515-4c77-967b-a70ef32cd85e/network-metrics-daemon/0.log"
Apr 21 16:06:28.306561 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:28.306537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x5zkt_088bc7e8-4515-4c77-967b-a70ef32cd85e/kube-rbac-proxy/0.log"
Apr 21 16:06:29.119994 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.119935 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-controller/0.log"
Apr 21 16:06:29.135458 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.135428 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/0.log"
Apr 21 16:06:29.152177 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.152136 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovn-acl-logging/1.log"
Apr 21 16:06:29.176439 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.176416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/kube-rbac-proxy-node/0.log"
Apr 21 16:06:29.196530 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.196501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 16:06:29.214309 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.214287 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/northd/0.log"
Apr 21 16:06:29.233321 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.233302 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/nbdb/0.log"
Apr 21 16:06:29.254303 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.254285 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/sbdb/0.log"
Apr 21 16:06:29.418113 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:29.418015 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7tbgn_df61a406-9ba7-4b2d-94b8-03e6d97a8118/ovnkube-controller/0.log"
Apr 21 16:06:31.188118 ip-10-0-132-141 kubenswrapper[2579]: I0421 16:06:31.188087 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zmzmd_f9899d71-8c55-4ec8-929c-ab8f3dcf09e9/network-check-target-container/0.log"