Apr 23 13:29:40.126184 ip-10-0-129-103 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 13:29:40.126196 ip-10-0-129-103 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 13:29:40.126206 ip-10-0-129-103 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 13:29:40.126508 ip-10-0-129-103 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 13:29:51.447208 ip-10-0-129-103 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 13:29:51.447225 ip-10-0-129-103 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 94dc91b186ad43b886dc16e3994c3423 --
Apr 23 13:32:07.945913 ip-10-0-129-103 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:32:08.359703 ip-10-0-129-103 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:08.359703 ip-10-0-129-103 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:32:08.359703 ip-10-0-129-103 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:08.359703 ip-10-0-129-103 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:32:08.359703 ip-10-0-129-103 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:08.361095 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.361018 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:32:08.365337 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365324 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:08.365337 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365338 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365342 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365345 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365348 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365351 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365354 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365357 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365360 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365362 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365365 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365368 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365370 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365373 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365377 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365381 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365384 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365388 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365391 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365394 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:08.365406 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365396 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365399 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365402 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365406 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365409 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365412 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365415 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365418 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365421 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365424 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365426 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365429 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365431 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365434 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365436 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365438 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365441 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365443 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365445 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365448 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:08.365890 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365450 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365453 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365455 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365458 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365460 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365462 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365465 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365467 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365469 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365472 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365474 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365476 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365479 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365482 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365485 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365487 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365490 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365493 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365495 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:08.366381 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365499 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365501 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365504 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365507 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365509 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365511 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365514 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365516 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365519 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365521 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365536 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365539 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365542 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365544 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365547 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365549 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365552 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365555 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365557 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365560 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:08.366885 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365562 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365565 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365567 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365570 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365572 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365574 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365578 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365952 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365957 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365960 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365963 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365966 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365969 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365971 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365974 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365976 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365979 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365981 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365984 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365986 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:08.367360 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365989 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365993 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365996 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.365999 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366001 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366004 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366006 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366009 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366011 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366013 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366016 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366018 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366021 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366023 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366026 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366029 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366031 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366033 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366036 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:08.367946 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366039 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366041 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366044 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366046 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366049 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366051 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366054 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366056 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366058 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366061 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366063 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366066 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366068 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366070 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366073 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366075 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366077 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366080 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366082 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366085 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:08.368419 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366088 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366090 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366092 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366095 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366097 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366100 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366102 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366105 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366107 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366109 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366112 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366114 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366117 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366119 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366122 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366124 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366127 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366129 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366132 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366135 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:08.368940 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366137 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366139 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366143 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366145 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366148 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366150 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366152 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366155 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366157 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366160 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366162 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366166 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366170 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366173 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366257 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366264 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366270 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366276 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366280 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366283 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366288 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:32:08.369422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366292 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366295 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366298 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366302 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366305 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366308 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366311 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366314 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366317 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366319 2581 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366322 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366326 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366330 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366333 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366336 2581 flags.go:64] FLAG: --config-dir=""
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366338 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366342 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366345 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366349 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366352 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366355 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366358 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366361 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366363 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366366 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:32:08.369950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366369 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366374 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366377 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366380 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366383 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366386 2581 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366389 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366394 2581 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366397 2581 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366400 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366403 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366406 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366410 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366413 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366416 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366418 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366421 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366424 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366427 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366430 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366433 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366436 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366438 2581 flags.go:64] FLAG: --feature-gates="" Apr 23 
13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366442 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366445 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:32:08.370590 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366447 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366450 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366453 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366456 2581 flags.go:64] FLAG: --help="false" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366459 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-129-103.ec2.internal" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366462 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366464 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366467 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366470 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366474 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366477 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366480 
2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366483 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366486 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366493 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366496 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366498 2581 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366502 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366505 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366508 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366511 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366514 2581 flags.go:64] FLAG: --lock-file="" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366517 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366520 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:32:08.371195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366536 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366542 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:32:08.371819 
ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366545 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366548 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366551 2581 flags.go:64] FLAG: --logging-format="text" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366554 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366557 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366560 2581 flags.go:64] FLAG: --manifest-url="" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366563 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366567 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366570 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366574 2581 flags.go:64] FLAG: --max-pods="110" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366577 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366580 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366582 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366585 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366588 2581 
flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366591 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366594 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366601 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366604 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366607 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366612 2581 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:32:08.371819 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366614 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366620 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366623 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366626 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366628 2581 flags.go:64] FLAG: --port="10250" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366631 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366634 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b82224021299fb40" Apr 23 13:32:08.372366 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:08.366637 2581 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366640 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366643 2581 flags.go:64] FLAG: --register-node="true" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366646 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366649 2581 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366652 2581 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366655 2581 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366658 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366661 2581 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366665 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366668 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366670 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366673 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366676 2581 flags.go:64] FLAG: --runonce="false" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366679 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:32:08.372366 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:08.366681 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366684 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366687 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366690 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:32:08.372366 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366695 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366698 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366701 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366704 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366706 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366709 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366713 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366717 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366720 2581 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366722 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:08.366728 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366730 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366733 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366737 2581 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366740 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366742 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366745 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366748 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366751 2581 flags.go:64] FLAG: --v="2" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366755 2581 flags.go:64] FLAG: --version="false" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366759 2581 flags.go:64] FLAG: --vmodule="" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366763 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.366767 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366854 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:08.373023 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366858 2581 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366860 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366863 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366866 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366868 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366871 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366873 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366876 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366883 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366885 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366888 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366890 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366894 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366898 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366901 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366904 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366906 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366909 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366912 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:08.373631 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366914 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366917 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366919 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366922 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366924 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366927 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366944 2581 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366949 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366952 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366955 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366958 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366962 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366965 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366968 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366971 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366973 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366976 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366978 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366981 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 
23 13:32:08.374116 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366983 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366986 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366989 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366992 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366995 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366997 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.366999 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367002 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367006 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367008 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367011 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367013 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367016 2581 feature_gate.go:328] unrecognized feature gate: 
KMSEncryptionProvider Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367018 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367021 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367023 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367026 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367028 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367031 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367033 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:08.374616 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367036 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367038 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367041 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367043 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367047 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367049 2581 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367052 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367054 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367057 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367059 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367062 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367064 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367067 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367069 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367072 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367075 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367077 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367080 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:08.375096 
ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367082 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367084 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:08.375096 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367088 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367091 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367093 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367095 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367098 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367101 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.367103 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.367678 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]}
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.375518 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.375556 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375606 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375612 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375615 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375618 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375621 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:08.375629 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375624 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375627 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375630 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375633 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375636 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375639 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375642 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375645 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375647 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375650 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375653 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375655 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375658 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375660 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375663 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375665 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375668 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375670 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375672 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375675 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:08.376080 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375677 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375680 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375684 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375688 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375691 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375694 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375697 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375700 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375703 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375707 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375710 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375713 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375716 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375718 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375722 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375724 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375727 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375730 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375732 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:08.376858 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375735 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375738 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375740 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375743 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375745 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375748 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375750 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375752 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375755 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375757 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375760 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375763 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375765 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375768 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375770 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375773 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375775 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375778 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375780 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375783 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:08.377365 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375785 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375788 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375791 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375793 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375796 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375798 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375801 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375804 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375807 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375809 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375812 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375814 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375817 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375819 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375822 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375824 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375827 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375829 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375832 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375834 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:08.377874 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375837 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.375839 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.375844 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376325 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376342 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376354 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376360 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376364 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376371 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376398 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376403 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376409 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376413 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376418 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376423 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:08.378370 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376427 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376432 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376442 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376446 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376451 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376455 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376459 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376464 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376468 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376472 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376476 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376480 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376484 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376488 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376492 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376501 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376505 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376509 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376513 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:08.378759 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376517 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376521 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376544 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376549 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376553 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376557 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376561 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376565 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376569 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376578 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376581 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376585 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376589 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376594 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376598 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376602 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376606 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376610 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376614 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376619 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:08.379251 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376623 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376632 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376636 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376640 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376644 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376647 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376651 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376655 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376659 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376664 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376668 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376674 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376681 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376686 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376696 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376701 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376706 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376710 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376714 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376718 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:08.379745 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376722 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376726 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376730 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376734 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376737 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376742 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376751 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376755 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376759 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376763 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376767 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376771 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376775 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376780 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:08.376783 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.376791 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:32:08.380231 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.377692 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:32:08.380639 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.379608 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:32:08.380639 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.380460 2581 server.go:1019] "Starting client certificate rotation"
Apr 23 13:32:08.380639 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.380555 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:32:08.381369 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.381356 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:32:08.404664 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.404643 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:32:08.408327 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.408310 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:32:08.420596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.420458 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:32:08.425951 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.425938 2581 log.go:25] "Validated CRI v1 image API"
Apr 23 13:32:08.427760 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.427746 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:32:08.431730 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.431710 2581 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c13b5444-48e7-47d9-8973-ecac0c436dab:/dev/nvme0n1p4 f76ccb29-1c3b-45f5-98c6-6a9e0dcbdc86:/dev/nvme0n1p3]
Apr 23 13:32:08.431793 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.431730 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:32:08.439088 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.438770 2581 manager.go:217] Machine: {Timestamp:2026-04-23 13:32:08.436849523 +0000 UTC m=+0.377023976 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3119951 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24e92e91d990b93249512310934b5c SystemUUID:ec24e92e-91d9-90b9-3249-512310934b5c BootID:94dc91b1-86ad-43b8-86dc-16e3994c3423 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:32:f6:ac:98:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:f6:ac:98:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:5a:ed:09:ef:09 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:32:08.439088 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.439080 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:32:08.439249 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.439189 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:08.439249 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.439193 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:32:08.440270 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.440246 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:32:08.440423 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.440271 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-103.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:32:08.440506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.440434 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:32:08.440506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.440447 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:32:08.440506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.440465 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:08.441302 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.441290 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:08.442093 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.442080 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:08.442228 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.442217 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:32:08.445428 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.445416 2581 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:32:08.445498 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.445435 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:32:08.445498 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.445457 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:32:08.445498 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.445470 2581 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:32:08.445498 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.445482 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:32:08.446633 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.446619 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:08.446700 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.446642 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:08.449428 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.449406 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:32:08.450701 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.450684 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452367 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452385 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452392 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452409 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452417 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452423 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452429 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452434 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452442 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452448 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452456 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 13:32:08.452475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.452465 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 13:32:08.454160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.454148 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:32:08.454160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.454160 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:32:08.457662 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457648 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:32:08.457747 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457697 2581 server.go:1295] "Started kubelet"
Apr 23 13:32:08.457747 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457704 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-103.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:32:08.457821 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457770 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:32:08.457853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457802 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:32:08.457880 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.457869 2581 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:32:08.458412 ip-10-0-129-103 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:32:08.458890 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.458694 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:32:08.458890 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.458703 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:32:08.458890 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.458703 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:32:08.458890 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.458841 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpn4t"
Apr 23 13:32:08.459285 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.459186 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:32:08.463496 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.463477 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:08.463977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.463959 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:32:08.464858 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.464828 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpn4t"
Apr 23 13:32:08.465024 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.464872 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.465095 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.464980 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:32:08.465095 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.464982 2581 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:32:08.465095 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465053 2581 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:32:08.465095 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465009 2581 factory.go:55] Registering systemd factory
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465106 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465167 2581 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465177 2581 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465288 2581 factory.go:153] Registering CRI-O factory
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465302 2581 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465364 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:32:08.465388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465388 2581 factory.go:103] Registering Raw factory
Apr 23 13:32:08.465707 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.465402 2581 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:32:08.466826 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.466467 2581 manager.go:319] Starting recovery of all containers
Apr 23 13:32:08.468297 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.468200 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 13:32:08.468372 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.468327 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 13:32:08.472212 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.468248 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-103.ec2.internal.18a8ffa2faf2c563 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-103.ec2.internal,UID:ip-10-0-129-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-103.ec2.internal,},FirstTimestamp:2026-04-23 13:32:08.457659747 +0000 UTC m=+0.397834198,LastTimestamp:2026-04-23 13:32:08.457659747 +0000 UTC m=+0.397834198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-103.ec2.internal,}"
Apr 23 13:32:08.472556 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.472483 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:32:08.483235 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.483218 2581 manager.go:324] Recovery completed
Apr 23 13:32:08.487396 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.487383 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.489610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.489594 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.489677 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.489623 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.489677 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.489633 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.490104 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.490090 2581 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:32:08.490104 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.490102 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:32:08.490219 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.490117 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:08.491798 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.491733 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-103.ec2.internal.18a8ffa2fcda4a8c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-103.ec2.internal,UID:ip-10-0-129-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-103.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-103.ec2.internal,},FirstTimestamp:2026-04-23 13:32:08.489609868 +0000 UTC m=+0.429784319,LastTimestamp:2026-04-23 13:32:08.489609868 +0000 UTC m=+0.429784319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-103.ec2.internal,}"
Apr 23 13:32:08.492192 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.492176 2581 policy_none.go:49] "None policy: Start"
Apr 23 13:32:08.492192 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.492194 2581 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:32:08.492303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.492204 2581 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:32:08.528189 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528173 2581 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.528216 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528229 2581 server.go:85] "Starting device plugin registration server"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528444 2581 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528455 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528558 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528652 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.528665 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.529269 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:32:08.540682 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.529324 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.570881 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.570858 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:32:08.572154 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.572134 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:32:08.572154 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.572157 2581 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:32:08.572281 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.572173 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:32:08.572281 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.572179 2581 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:32:08.572281 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.572208 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:32:08.574926 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.574908 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:08.629296 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.629242 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.631932 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.631916 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.632010 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.631945 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.632010 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.631957 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.632010 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.631985 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.641941 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.641927 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.642000 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.641948 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-103.ec2.internal\": node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.667600 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.667579 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.672669 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.672656 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"]
Apr 23 13:32:08.672716 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.672706 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.673452 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.673437 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.673512 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.673468 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.673512 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.673478 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.674605 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.674594 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.674739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.674719 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.674813 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.674752 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.675218 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675204 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.675303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675230 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.675303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675209 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.675303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675273 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.675303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675244 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.675303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.675289 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.676190 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.676174 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.676256 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.676206 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:08.676809 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.676794 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:08.676871 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.676823 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:08.676871 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.676837 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:08.705149 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.705126 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-103.ec2.internal\" not found" node="ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.709609 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.709593 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-103.ec2.internal\" not found" node="ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.766301 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.766273 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.766375 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.766305 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/418931079fe802d41c31b61d5b4bcb82-config\") pod \"kube-apiserver-proxy-ip-10-0-129-103.ec2.internal\" (UID: \"418931079fe802d41c31b61d5b4bcb82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.766375 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.766322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.768419 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.768404 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.866613 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.866703 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866615 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.866703 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/418931079fe802d41c31b61d5b4bcb82-config\") pod \"kube-apiserver-proxy-ip-10-0-129-103.ec2.internal\" (UID: \"418931079fe802d41c31b61d5b4bcb82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.866703 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866679 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/418931079fe802d41c31b61d5b4bcb82-config\") pod \"kube-apiserver-proxy-ip-10-0-129-103.ec2.internal\" (UID: \"418931079fe802d41c31b61d5b4bcb82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.866703 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866680 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.866703 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:08.866680 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c89c3d2b9be5aaa6a987cf4d08bdee6e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal\" (UID: \"c89c3d2b9be5aaa6a987cf4d08bdee6e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:08.868672 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.868655 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:08.969418 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:08.969355 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.007537 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.007504 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:09.012057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.012042 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:09.069936 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:09.069910 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.170395 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:09.170365 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.270964 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:09.270900 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.371378 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:09.371356 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.380519 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.380496 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:32:09.380681 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.380664 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:09.463736 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.463711 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:09.471455 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:09.471431 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-103.ec2.internal\" not found"
Apr 23 13:32:09.477367 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.477341 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:27:08 +0000 UTC" deadline="2027-10-20 16:10:37.962283202 +0000 UTC"
Apr 23 13:32:09.477367 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.477366 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13082h38m28.484919984s"
Apr 23 13:32:09.479753 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.479737 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:09.544203 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.544185 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:09.549476 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:09.549452 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89c3d2b9be5aaa6a987cf4d08bdee6e.slice/crio-3f978313018b2983ad26dd966dd1d49e14734236dc7f8a0266f88cd8424ce4bc WatchSource:0}: Error finding container 3f978313018b2983ad26dd966dd1d49e14734236dc7f8a0266f88cd8424ce4bc: Status 404 returned error can't find the container with id 3f978313018b2983ad26dd966dd1d49e14734236dc7f8a0266f88cd8424ce4bc
Apr 23 13:32:09.553126 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.553112 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:32:09.555375 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.555359 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-x5pvs"
Apr 23 13:32:09.563275 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.563261 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-x5pvs"
Apr 23 13:32:09.565384 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.565362 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal"
Apr 23 13:32:09.573181 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:09.573138 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418931079fe802d41c31b61d5b4bcb82.slice/crio-f5618807c4ff28e3889471b5ab4c3bdf991ab6ea409146ad2da8ce5160b4dcf1 WatchSource:0}: Error finding container f5618807c4ff28e3889471b5ab4c3bdf991ab6ea409146ad2da8ce5160b4dcf1: Status 404 returned error can't find the container with id f5618807c4ff28e3889471b5ab4c3bdf991ab6ea409146ad2da8ce5160b4dcf1
Apr 23 13:32:09.575752 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.575699 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal" event={"ID":"c89c3d2b9be5aaa6a987cf4d08bdee6e","Type":"ContainerStarted","Data":"3f978313018b2983ad26dd966dd1d49e14734236dc7f8a0266f88cd8424ce4bc"}
Apr 23 13:32:09.575752 ip-10-0-129-103
kubenswrapper[2581]: I0423 13:32:09.575729 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:32:09.577270 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.577257 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal" Apr 23 13:32:09.583669 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.583653 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:32:09.649777 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.649761 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:09.980721 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:09.980640 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:10.446825 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.446792 2581 apiserver.go:52] "Watching apiserver" Apr 23 13:32:10.454888 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.454866 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:32:10.455928 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.455903 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-h6brm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds","openshift-cluster-node-tuning-operator/tuned-zjrsd","openshift-dns/node-resolver-qgrhz","openshift-image-registry/node-ca-mx792","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal","openshift-multus/multus-9vj47","openshift-multus/multus-additional-cni-plugins-vwqk2","kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal","openshift-multus/network-metrics-daemon-ctn87","openshift-network-diagnostics/network-check-target-l79sj","openshift-network-operator/iptables-alerter-msldf","openshift-ovn-kubernetes/ovnkube-node-4shpw"] Apr 23 13:32:10.458828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.458812 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.460921 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.460898 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.461343 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.461320 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:32:10.461441 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.461342 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:32:10.461441 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.461352 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:32:10.461441 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.461346 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:32:10.461699 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.461680 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8v757\"" Apr 23 13:32:10.463329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.463084 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:32:10.463329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.463228 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pcrnb\"" Apr 23 13:32:10.463329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.463232 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:32:10.464021 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.463989 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:32:10.465583 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.465171 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.467155 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.467139 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:32:10.467814 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.467653 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vk77b\"" Apr 23 13:32:10.467814 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.467703 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:32:10.467943 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.467811 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.469851 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.469831 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:10.469952 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.469850 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:10.470044 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.470028 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.470396 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.470382 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sr2g2\"" Apr 23 13:32:10.472020 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.472002 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:32:10.472147 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.472133 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:32:10.472232 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.472219 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jcpkp\"" Apr 23 13:32:10.472299 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.472287 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:32:10.472384 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.472367 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.474438 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.474418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-registration-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.474559 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvfbj\"" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.474722 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.474940 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-tuned\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.474984 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-os-release\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475012 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-hostroot\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475048 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475071 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475208 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-systemd\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475242 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-tmp\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475273 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b35db408-8233-451b-984c-90d90df7a815-hosts-file\") pod \"node-resolver-qgrhz\" (UID: 
\"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqspc\" (UniqueName: \"kubernetes.io/projected/b35db408-8233-451b-984c-90d90df7a815-kube-api-access-nqspc\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475402 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-host\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475454 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-daemon-config\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-etc-kubernetes\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475655 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-modprobe-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475674 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475704 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-system-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475737 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475786 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-device-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475816 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-etc-selinux\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475847 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-host\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.475993 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9q7l\" (UniqueName: \"kubernetes.io/projected/3b32a45a-dd10-48d7-9261-50a4c50f588a-kube-api-access-c9q7l\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qkhk\" (UniqueName: \"kubernetes.io/projected/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-kube-api-access-5qkhk\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476024 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476045 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-sys-fs\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476065 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-cnibin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476083 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-kubernetes\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476104 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-sys\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476119 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-lib-modules\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.476158 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-socket-dir-parent\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476200 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-k8s-cni-cncf-io\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476219 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-conf-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:10.476237 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jbw\" (UniqueName: \"kubernetes.io/projected/8ebc3830-6349-407a-984d-4ca78ca8e182-kube-api-access-n4jbw\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476294 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysconfig\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.477877 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b35db408-8233-451b-984c-90d90df7a815-tmp-dir\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-multus-certs\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476382 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476396 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ms66\" (UniqueName: \"kubernetes.io/projected/2be66145-0b39-40a6-8234-797ef2cbdb16-kube-api-access-6ms66\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-conf\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-run\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476479 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-var-lib-kubelet\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476553 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-bin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-serviceca\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-cni-binary-copy\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476935 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-netns\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.476962 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-multus\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.477096 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-kubelet\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.478578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.477129 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-socket-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.479437 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.479192 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:32:10.479437 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.479206 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dswd5\""
Apr 23 13:32:10.479437 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.479321 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj"
Apr 23 13:32:10.479437 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.479395 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605"
Apr 23 13:32:10.479641 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.479496 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:32:10.481826 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.481809 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-msldf"
Apr 23 13:32:10.483952 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.483926 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:10.484052 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.484037 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:10.484115 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.484052 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-58r5z\""
Apr 23 13:32:10.484165 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.484134 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 13:32:10.485782 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.485761 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.488013 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.487864 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 13:32:10.488718 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.488684 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 13:32:10.488990 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.488974 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 13:32:10.489162 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.489091 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 13:32:10.489162 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.489137 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 13:32:10.489347 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.489248 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8cjws\""
Apr 23 13:32:10.489347 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.489304 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 13:32:10.564562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.564498 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:09 +0000 UTC" deadline="2027-11-25 06:28:37.487349623 +0000 UTC"
Apr 23 13:32:10.564562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.564552 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13936h56m26.922802023s"
Apr 23 13:32:10.566328 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.566152 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:32:10.577274 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577254 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-system-cni-dir\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577285 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-netns\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-multus\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577330 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-kubelet\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577345 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-tmp\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqspc\" (UniqueName: \"kubernetes.io/projected/b35db408-8233-451b-984c-90d90df7a815-kube-api-access-nqspc\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577363 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-netns\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-host\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577391 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-multus\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577402 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-kubelet\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577414 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-host\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577415 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-netd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577446 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-os-release\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577469 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-hostroot\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577491 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577514 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b35db408-8233-451b-984c-90d90df7a815-hosts-file\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577519 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-hostroot\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577517 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-os-release\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-systemd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577583 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b35db408-8233-451b-984c-90d90df7a815-hosts-file\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-config\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577600 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577632 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-daemon-config\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577658 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-modprobe-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577677 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 13:32:10.577739 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577691 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-cnibin\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577709 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-etc-selinux\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577724 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-host\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577738 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-node-log\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577754 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577772 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-cnibin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577770 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-modprobe-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577806 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-kubernetes\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577829 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-sys\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577854 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-k8s-cni-cncf-io\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577857 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-etc-selinux\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577882 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-kubernetes\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577877 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b35db408-8233-451b-984c-90d90df7a815-tmp-dir\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577831 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-host\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/50b7daad-20fe-4160-ba67-2e5371f39d68-kube-api-access-v4srt\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-cnibin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577938 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ms66\" (UniqueName: \"kubernetes.io/projected/2be66145-0b39-40a6-8234-797ef2cbdb16-kube-api-access-6ms66\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577943 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-sys\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.578610 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-run\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.577977 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-var-lib-kubelet\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578007 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-netns\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578034 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-ovn\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578058 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-bin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578084 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50b7daad-20fe-4160-ba67-2e5371f39d68-host-slash\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578100 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-var-lib-kubelet\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578109 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-kubelet\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-systemd-units\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578209 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-run\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578229 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b35db408-8233-451b-984c-90d90df7a815-tmp-dir\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578251 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-daemon-config\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578309 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-k8s-cni-cncf-io\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578329 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-slash\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578344 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal" event={"ID":"418931079fe802d41c31b61d5b4bcb82","Type":"ContainerStarted","Data":"f5618807c4ff28e3889471b5ab4c3bdf991ab6ea409146ad2da8ce5160b4dcf1"}
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578355 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578379 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-env-overrides\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.579386 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578405 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qkhk\" (UniqueName: \"kubernetes.io/projected/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-kube-api-access-5qkhk\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578429 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-cni-binary-copy\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578455 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-socket-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578479 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-registration-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578499 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-var-lib-cni-bin\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578507 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-tuned\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-script-lib\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-registration-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578673 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-systemd\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578698 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-os-release\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578720 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578747 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-socket-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578763 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-etc-kubernetes\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578845 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/50b7daad-20fe-4160-ba67-2e5371f39d68-iptables-alerter-script\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578863 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-etc-kubernetes\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580008 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578872 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-system-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578898 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-device-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578942 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-systemd\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578944 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-system-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578946 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9q7l\" (UniqueName: \"kubernetes.io/projected/3b32a45a-dd10-48d7-9261-50a4c50f588a-kube-api-access-c9q7l\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd"
Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName:
\"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-d\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.578990 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-cni-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579007 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579028 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-device-dir\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579035 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-log-socket\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579062 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vm4\" (UniqueName: \"kubernetes.io/projected/8eaf8674-35ae-40d6-b12b-07e254516721-kube-api-access-k4vm4\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579104 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579131 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-sys-fs\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579166 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-lib-modules\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579187 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ebc3830-6349-407a-984d-4ca78ca8e182-cni-binary-copy\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: 
I0423 13:32:10.579201 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.580562 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579226 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be66145-0b39-40a6-8234-797ef2cbdb16-sys-fs\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579231 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznnl\" (UniqueName: \"kubernetes.io/projected/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-kube-api-access-zznnl\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579252 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-lib-modules\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579276 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-socket-dir-parent\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " 
pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579303 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-conf-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579323 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jbw\" (UniqueName: \"kubernetes.io/projected/8ebc3830-6349-407a-984d-4ca78ca8e182-kube-api-access-n4jbw\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579342 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysconfig\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579344 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-socket-dir-parent\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579364 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267b4640-e7c2-4100-9c7d-2623b5ee12fd-agent-certs\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " 
pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579389 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267b4640-e7c2-4100-9c7d-2623b5ee12fd-konnectivity-ca\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579462 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-var-lib-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579485 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-etc-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579506 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-multus-certs\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579547 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-conf\") 
pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579570 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysconfig\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579574 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579547 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-multus-conf-dir\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.581306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579612 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-bin\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579634 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8eaf8674-35ae-40d6-b12b-07e254516721-ovn-node-metrics-cert\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579641 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ebc3830-6349-407a-984d-4ca78ca8e182-host-run-multus-certs\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579652 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579676 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-sysctl-conf\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.579690 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nv7p\" (UniqueName: \"kubernetes.io/projected/fe05d38e-d020-46dd-95d4-832fb5c93359-kube-api-access-6nv7p\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: 
I0423 13:32:10.579723 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-serviceca\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.580126 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-serviceca\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.581162 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-etc-tuned\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.582094 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.581198 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b32a45a-dd10-48d7-9261-50a4c50f588a-tmp\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.586472 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.586418 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqspc\" (UniqueName: \"kubernetes.io/projected/b35db408-8233-451b-984c-90d90df7a815-kube-api-access-nqspc\") pod \"node-resolver-qgrhz\" (UID: \"b35db408-8233-451b-984c-90d90df7a815\") " pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.586900 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.586880 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6ms66\" (UniqueName: \"kubernetes.io/projected/2be66145-0b39-40a6-8234-797ef2cbdb16-kube-api-access-6ms66\") pod \"aws-ebs-csi-driver-node-gktds\" (UID: \"2be66145-0b39-40a6-8234-797ef2cbdb16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.587729 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.587707 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qkhk\" (UniqueName: \"kubernetes.io/projected/01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c-kube-api-access-5qkhk\") pod \"node-ca-mx792\" (UID: \"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c\") " pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.587901 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.587866 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jbw\" (UniqueName: \"kubernetes.io/projected/8ebc3830-6349-407a-984d-4ca78ca8e182-kube-api-access-n4jbw\") pod \"multus-9vj47\" (UID: \"8ebc3830-6349-407a-984d-4ca78ca8e182\") " pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.588640 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.588624 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9q7l\" (UniqueName: \"kubernetes.io/projected/3b32a45a-dd10-48d7-9261-50a4c50f588a-kube-api-access-c9q7l\") pod \"tuned-zjrsd\" (UID: \"3b32a45a-dd10-48d7-9261-50a4c50f588a\") " pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.680916 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.680878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-netd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.680916 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.680910 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-systemd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.680926 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-config\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.680982 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-systemd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.680987 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-netd\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681007 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-cnibin\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681031 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-node-log\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681048 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681066 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/50b7daad-20fe-4160-ba67-2e5371f39d68-kube-api-access-v4srt\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.681118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-netns\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681130 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-node-log\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: 
I0423 13:32:10.681135 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-cnibin\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-ovn\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681203 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50b7daad-20fe-4160-ba67-2e5371f39d68-host-slash\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681205 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-run-netns\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:10.681232 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-ovn\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50b7daad-20fe-4160-ba67-2e5371f39d68-host-slash\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681259 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-kubelet\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681283 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-systemd-units\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-slash\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:10.681334 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-kubelet\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681360 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-systemd-units\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681368 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-slash\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681373 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-env-overrides\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.681482 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:10.681400 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681404 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-config\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681411 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-script-lib\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681437 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-os-release\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: 
\"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681575 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681596 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-os-release\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681602 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/50b7daad-20fe-4160-ba67-2e5371f39d68-iptables-alerter-script\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681641 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681673 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-log-socket\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vm4\" (UniqueName: \"kubernetes.io/projected/8eaf8674-35ae-40d6-b12b-07e254516721-kube-api-access-k4vm4\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681704 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-env-overrides\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681774 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zznnl\" (UniqueName: \"kubernetes.io/projected/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-kube-api-access-zznnl\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681799 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267b4640-e7c2-4100-9c7d-2623b5ee12fd-agent-certs\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681815 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267b4640-e7c2-4100-9c7d-2623b5ee12fd-konnectivity-ca\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.682284 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681829 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-var-lib-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681852 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-etc-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:10.681879 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681904 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-bin\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681928 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eaf8674-35ae-40d6-b12b-07e254516721-ovn-node-metrics-cert\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681954 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.681984 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nv7p\" (UniqueName: \"kubernetes.io/projected/fe05d38e-d020-46dd-95d4-832fb5c93359-kube-api-access-6nv7p\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " 
pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-system-cni-dir\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682012 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eaf8674-35ae-40d6-b12b-07e254516721-ovnkube-script-lib\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682077 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/50b7daad-20fe-4160-ba67-2e5371f39d68-iptables-alerter-script\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682085 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-run-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-system-cni-dir\") pod 
\"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.682079 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682138 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-etc-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682142 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-var-lib-openvswitch\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682120 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.682184 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:11.182152814 +0000 UTC m=+3.122327251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:10.682834 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682221 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-host-cni-bin\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.683797 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682413 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.683797 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682484 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8eaf8674-35ae-40d6-b12b-07e254516721-log-socket\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.683797 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe05d38e-d020-46dd-95d4-832fb5c93359-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.683797 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:10.682913 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267b4640-e7c2-4100-9c7d-2623b5ee12fd-konnectivity-ca\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.683797 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.682986 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe05d38e-d020-46dd-95d4-832fb5c93359-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.685248 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.685225 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267b4640-e7c2-4100-9c7d-2623b5ee12fd-agent-certs\") pod \"konnectivity-agent-h6brm\" (UID: \"267b4640-e7c2-4100-9c7d-2623b5ee12fd\") " pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.685753 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.685729 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eaf8674-35ae-40d6-b12b-07e254516721-ovn-node-metrics-cert\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.687671 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.687653 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:10.687671 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.687671 2581 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:10.687850 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.687682 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:10.687850 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:10.687734 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:11.18772183 +0000 UTC m=+3.127896268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:10.688885 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.688862 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/50b7daad-20fe-4160-ba67-2e5371f39d68-kube-api-access-v4srt\") pod \"iptables-alerter-msldf\" (UID: \"50b7daad-20fe-4160-ba67-2e5371f39d68\") " pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.690234 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.690217 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vm4\" (UniqueName: 
\"kubernetes.io/projected/8eaf8674-35ae-40d6-b12b-07e254516721-kube-api-access-k4vm4\") pod \"ovnkube-node-4shpw\" (UID: \"8eaf8674-35ae-40d6-b12b-07e254516721\") " pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.691379 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.691358 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznnl\" (UniqueName: \"kubernetes.io/projected/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-kube-api-access-zznnl\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:10.691475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.691456 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nv7p\" (UniqueName: \"kubernetes.io/projected/fe05d38e-d020-46dd-95d4-832fb5c93359-kube-api-access-6nv7p\") pod \"multus-additional-cni-plugins-vwqk2\" (UID: \"fe05d38e-d020-46dd-95d4-832fb5c93359\") " pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.770162 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.770048 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9vj47" Apr 23 13:32:10.780827 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.780792 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" Apr 23 13:32:10.789434 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.789415 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qgrhz" Apr 23 13:32:10.794314 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.794293 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" Apr 23 13:32:10.800830 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.800812 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mx792" Apr 23 13:32:10.807321 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.807305 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" Apr 23 13:32:10.813865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.813832 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:10.821372 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.821351 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-msldf" Apr 23 13:32:10.826955 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.826931 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:10.873457 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:10.872991 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:11.185508 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.185438 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:11.185652 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.185568 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:11.185652 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.185628 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.185612124 +0000 UTC m=+4.125786564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:11.223002 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.222974 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ebc3830_6349_407a_984d_4ca78ca8e182.slice/crio-34925ec58af7a8cb75418f39ce26f97008c095427046f76f249008c446105925 WatchSource:0}: Error finding container 34925ec58af7a8cb75418f39ce26f97008c095427046f76f249008c446105925: Status 404 returned error can't find the container with id 34925ec58af7a8cb75418f39ce26f97008c095427046f76f249008c446105925 Apr 23 13:32:11.224815 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.224788 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be66145_0b39_40a6_8234_797ef2cbdb16.slice/crio-14b36e0c6cb416f589089f2120a43198ad9617657fb515e7b96937d1c553a780 WatchSource:0}: Error finding container 14b36e0c6cb416f589089f2120a43198ad9617657fb515e7b96937d1c553a780: Status 404 returned error can't find the container with id 14b36e0c6cb416f589089f2120a43198ad9617657fb515e7b96937d1c553a780 Apr 23 13:32:11.227612 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.227573 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe05d38e_d020_46dd_95d4_832fb5c93359.slice/crio-2070f40761333fa6c149c8f97266e47b679e10b60639f0580c3f481118e93ab5 WatchSource:0}: Error finding container 2070f40761333fa6c149c8f97266e47b679e10b60639f0580c3f481118e93ab5: Status 404 returned error can't find the container with id 2070f40761333fa6c149c8f97266e47b679e10b60639f0580c3f481118e93ab5 Apr 23 13:32:11.228311 
ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.228261 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b32a45a_dd10_48d7_9261_50a4c50f588a.slice/crio-70f80458fed310346789eb2e76179e5f8c2a2a1a3408426569a5e03656a4306f WatchSource:0}: Error finding container 70f80458fed310346789eb2e76179e5f8c2a2a1a3408426569a5e03656a4306f: Status 404 returned error can't find the container with id 70f80458fed310346789eb2e76179e5f8c2a2a1a3408426569a5e03656a4306f Apr 23 13:32:11.230052 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.229586 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb35db408_8233_451b_984c_90d90df7a815.slice/crio-8f88baa0249f5a2afe81ceda28cbf2a0a95036b1766dfee630b43d36447320df WatchSource:0}: Error finding container 8f88baa0249f5a2afe81ceda28cbf2a0a95036b1766dfee630b43d36447320df: Status 404 returned error can't find the container with id 8f88baa0249f5a2afe81ceda28cbf2a0a95036b1766dfee630b43d36447320df Apr 23 13:32:11.230379 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.230361 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fe271a_4dd9_4cd9_8fd7_07b0808bdb7c.slice/crio-aa3a171a0ece9810424c8db8476fa8db03e1ba5885017392c191fbc237b73e8e WatchSource:0}: Error finding container aa3a171a0ece9810424c8db8476fa8db03e1ba5885017392c191fbc237b73e8e: Status 404 returned error can't find the container with id aa3a171a0ece9810424c8db8476fa8db03e1ba5885017392c191fbc237b73e8e Apr 23 13:32:11.230952 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.230929 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267b4640_e7c2_4100_9c7d_2623b5ee12fd.slice/crio-68ea33e4cb4c21e02494af8d85eda9abcd6be55c4b0d4df8de12d1d7b545dada WatchSource:0}: 
Error finding container 68ea33e4cb4c21e02494af8d85eda9abcd6be55c4b0d4df8de12d1d7b545dada: Status 404 returned error can't find the container with id 68ea33e4cb4c21e02494af8d85eda9abcd6be55c4b0d4df8de12d1d7b545dada Apr 23 13:32:11.231771 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.231747 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b7daad_20fe_4160_ba67_2e5371f39d68.slice/crio-f0323d5f0143e15c00b025ca14237f80660602d5701de6bcca5358a04b804f41 WatchSource:0}: Error finding container f0323d5f0143e15c00b025ca14237f80660602d5701de6bcca5358a04b804f41: Status 404 returned error can't find the container with id f0323d5f0143e15c00b025ca14237f80660602d5701de6bcca5358a04b804f41 Apr 23 13:32:11.232561 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:11.232392 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eaf8674_35ae_40d6_b12b_07e254516721.slice/crio-29c4c31b3529daa9b348f35bad569b7815d8c3985840d2d2a3f9648a0173f84a WatchSource:0}: Error finding container 29c4c31b3529daa9b348f35bad569b7815d8c3985840d2d2a3f9648a0173f84a: Status 404 returned error can't find the container with id 29c4c31b3529daa9b348f35bad569b7815d8c3985840d2d2a3f9648a0173f84a Apr 23 13:32:11.285914 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.285887 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:11.286059 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.286042 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:11.286098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.286065 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:11.286098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.286075 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:11.286172 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:11.286130 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.286104609 +0000 UTC m=+4.226279048 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:11.565854 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.565669 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:09 +0000 UTC" deadline="2027-10-12 14:22:26.870095772 +0000 UTC" Apr 23 13:32:11.565854 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.565704 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12888h50m15.304394762s" Apr 23 13:32:11.583029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.582980 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" event={"ID":"3b32a45a-dd10-48d7-9261-50a4c50f588a","Type":"ContainerStarted","Data":"70f80458fed310346789eb2e76179e5f8c2a2a1a3408426569a5e03656a4306f"} Apr 23 13:32:11.585721 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.585668 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h6brm" event={"ID":"267b4640-e7c2-4100-9c7d-2623b5ee12fd","Type":"ContainerStarted","Data":"68ea33e4cb4c21e02494af8d85eda9abcd6be55c4b0d4df8de12d1d7b545dada"} Apr 23 13:32:11.587289 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.587237 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mx792" event={"ID":"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c","Type":"ContainerStarted","Data":"aa3a171a0ece9810424c8db8476fa8db03e1ba5885017392c191fbc237b73e8e"} Apr 23 13:32:11.590170 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:32:11.590112 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qgrhz" event={"ID":"b35db408-8233-451b-984c-90d90df7a815","Type":"ContainerStarted","Data":"8f88baa0249f5a2afe81ceda28cbf2a0a95036b1766dfee630b43d36447320df"} Apr 23 13:32:11.591838 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.591788 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"29c4c31b3529daa9b348f35bad569b7815d8c3985840d2d2a3f9648a0173f84a"} Apr 23 13:32:11.598653 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.597165 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-msldf" event={"ID":"50b7daad-20fe-4160-ba67-2e5371f39d68","Type":"ContainerStarted","Data":"f0323d5f0143e15c00b025ca14237f80660602d5701de6bcca5358a04b804f41"} Apr 23 13:32:11.600012 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.599988 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" event={"ID":"2be66145-0b39-40a6-8234-797ef2cbdb16","Type":"ContainerStarted","Data":"14b36e0c6cb416f589089f2120a43198ad9617657fb515e7b96937d1c553a780"} Apr 23 13:32:11.604505 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.604461 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerStarted","Data":"2070f40761333fa6c149c8f97266e47b679e10b60639f0580c3f481118e93ab5"} Apr 23 13:32:11.609684 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.609638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9vj47" event={"ID":"8ebc3830-6349-407a-984d-4ca78ca8e182","Type":"ContainerStarted","Data":"34925ec58af7a8cb75418f39ce26f97008c095427046f76f249008c446105925"} Apr 23 13:32:11.614647 
ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:11.614623 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal" event={"ID":"418931079fe802d41c31b61d5b4bcb82","Type":"ContainerStarted","Data":"67c6c2df07adfd5e7117c53b0913325bf9df5f9b1fc93052167e2c2fcbbed7d4"} Apr 23 13:32:12.193337 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.193288 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:12.193514 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.193465 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:12.193639 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.193550 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:14.193510961 +0000 UTC m=+6.133685401 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:12.294157 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.294063 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:12.294319 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.294237 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:12.294319 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.294255 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:12.294319 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.294269 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:12.294474 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.294333 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:14.294315825 +0000 UTC m=+6.234490270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:12.574828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.574735 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:12.574828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.574780 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:12.575335 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.574919 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:12.575335 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:12.575024 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:12.632006 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.631971 2581 generic.go:358] "Generic (PLEG): container finished" podID="c89c3d2b9be5aaa6a987cf4d08bdee6e" containerID="a31aead9f08970a386ff12076765e8c9f659efedd864b2c4916d76faf4fdf79b" exitCode=0 Apr 23 13:32:12.632169 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.632048 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal" event={"ID":"c89c3d2b9be5aaa6a987cf4d08bdee6e","Type":"ContainerDied","Data":"a31aead9f08970a386ff12076765e8c9f659efedd864b2c4916d76faf4fdf79b"} Apr 23 13:32:12.652113 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:12.652063 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-103.ec2.internal" podStartSLOduration=3.652047894 podStartE2EDuration="3.652047894s" podCreationTimestamp="2026-04-23 13:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:11.637890154 +0000 UTC m=+3.578064616" watchObservedRunningTime="2026-04-23 13:32:12.652047894 +0000 UTC m=+4.592222353" Apr 23 13:32:13.662975 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:13.662940 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal" event={"ID":"c89c3d2b9be5aaa6a987cf4d08bdee6e","Type":"ContainerStarted","Data":"ece7a46848285c09f5e9f4bdbca4598a574ad8014b1e13ed678c936d78b3f7a8"} Apr 23 13:32:14.209399 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:14.209359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:14.209593 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.209518 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:14.209660 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.209604 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.209583191 +0000 UTC m=+10.149757650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:14.310813 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:14.310774 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:14.311000 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.310973 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:14.311072 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.311008 2581 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:14.311072 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.311023 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:14.311170 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.311086 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.311065056 +0000 UTC m=+10.251239500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:14.575506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:14.574876 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:14.575506 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.574978 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:14.575506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:14.575297 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:14.575506 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:14.575405 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:16.574824 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:16.574792 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:16.575266 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:16.574925 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:16.575358 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:16.575312 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:16.575431 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:16.575408 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:18.245812 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:18.245765 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:18.246220 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.245921 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:18.246220 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.245985 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:26.245964113 +0000 UTC m=+18.186138573 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:18.346825 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:18.346785 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:18.347032 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.346950 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:18.347032 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.346975 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:18.347032 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.346990 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:18.347194 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.347037 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:26.34702508 +0000 UTC m=+18.287199518 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:18.576061 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:18.575355 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:18.576061 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.575488 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:18.576061 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:18.575601 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:18.576061 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:18.575675 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:20.573206 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:20.573172 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:20.573206 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:20.573202 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:20.573648 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:20.573311 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:20.573648 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:20.573405 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:22.573404 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:22.573368 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:22.573829 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:22.573370 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:22.573829 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:22.573489 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:22.573829 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:22.573576 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:24.572563 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:24.572515 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:24.573057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:24.572516 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:24.573057 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:24.572667 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:24.573057 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:24.572754 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:26.309460 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.309424 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:26.309898 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.309564 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:26.309898 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.309618 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.309604467 +0000 UTC m=+34.249778904 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:26.410178 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.410138 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:26.410327 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.410280 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:26.410327 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.410304 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:26.410327 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.410316 2581 projected.go:194] Error preparing data for projected volume kube-api-access-7bbjs for pod openshift-network-diagnostics/network-check-target-l79sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:26.410492 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.410378 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs podName:774a8870-9d9e-4314-a059-b58aad91c605 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:42.410358815 +0000 UTC m=+34.350533256 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7bbjs" (UniqueName: "kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs") pod "network-check-target-l79sj" (UID: "774a8870-9d9e-4314-a059-b58aad91c605") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:26.573073 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.572997 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:26.573213 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.573142 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:26.573213 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.573205 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:26.573307 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.573286 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:26.825276 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.825192 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-103.ec2.internal" podStartSLOduration=17.825154301 podStartE2EDuration="17.825154301s" podCreationTimestamp="2026-04-23 13:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:13.679973583 +0000 UTC m=+5.620148125" watchObservedRunningTime="2026-04-23 13:32:26.825154301 +0000 UTC m=+18.765328780" Apr 23 13:32:26.825876 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.825858 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jwjhj"] Apr 23 13:32:26.895196 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:26.895172 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:26.895335 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:26.895247 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:27.016262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.016221 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.016411 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.016274 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-kubelet-config\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.016411 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.016342 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-dbus\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117447 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.117372 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117447 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.117406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-kubelet-config\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117447 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.117431 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-dbus\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117688 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.117510 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-dbus\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117688 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:27.117517 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:27.117688 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.117510 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7060a6e1-461e-45bb-85bd-9609300f9b17-kubelet-config\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.117688 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:27.117593 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret podName:7060a6e1-461e-45bb-85bd-9609300f9b17 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:27.617578294 +0000 UTC m=+19.557752731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret") pod "global-pull-secret-syncer-jwjhj" (UID: "7060a6e1-461e-45bb-85bd-9609300f9b17") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:27.619713 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:27.619679 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:27.620096 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:27.619785 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:27.620096 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:27.619836 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret podName:7060a6e1-461e-45bb-85bd-9609300f9b17 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:28.619823621 +0000 UTC m=+20.559998058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret") pod "global-pull-secret-syncer-jwjhj" (UID: "7060a6e1-461e-45bb-85bd-9609300f9b17") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:28.575706 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.575678 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:28.575810 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.575679 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:28.575868 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:28.575799 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:28.575868 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.575820 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:28.575969 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:28.575916 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:28.576052 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:28.575986 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:28.626960 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.626924 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:28.627639 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:28.627086 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:28.627639 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:28.627136 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret podName:7060a6e1-461e-45bb-85bd-9609300f9b17 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.627119572 +0000 UTC m=+22.567294024 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret") pod "global-pull-secret-syncer-jwjhj" (UID: "7060a6e1-461e-45bb-85bd-9609300f9b17") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:28.693016 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.692969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"06babf13cabba0474a76983ce0b16a67827dd5b1c5b4e52c2bcb3a5ba285255a"} Apr 23 13:32:28.695259 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.695231 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" event={"ID":"2be66145-0b39-40a6-8234-797ef2cbdb16","Type":"ContainerStarted","Data":"d114fa10f73a2db8069f2ff1c0e8f38d3f54eb879c4cc1a689eda4836498b479"} Apr 23 13:32:28.696445 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.696426 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9vj47" event={"ID":"8ebc3830-6349-407a-984d-4ca78ca8e182","Type":"ContainerStarted","Data":"7a5bae5438b58bf5d496c792b914c147b1d01c5298839f6334001d6d684b0366"} Apr 23 13:32:28.697827 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.697809 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" event={"ID":"3b32a45a-dd10-48d7-9261-50a4c50f588a","Type":"ContainerStarted","Data":"b73286b5223042cad0bcb9b1c01e0402598cf0b634c4c5fe0889b6bab8580bc8"} Apr 23 13:32:28.699394 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.699271 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h6brm" 
event={"ID":"267b4640-e7c2-4100-9c7d-2623b5ee12fd","Type":"ContainerStarted","Data":"f1d71aeec503f4858323ddfafb0428e5f459bc1effef412554095427fadefd8d"} Apr 23 13:32:28.700650 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.700630 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mx792" event={"ID":"01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c","Type":"ContainerStarted","Data":"31bc01364d0756046d5d2afa539f18a401ee5e33ac563a8c97aac904bcfaedbf"} Apr 23 13:32:28.701863 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.701842 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qgrhz" event={"ID":"b35db408-8233-451b-984c-90d90df7a815","Type":"ContainerStarted","Data":"ba3da3d3f719b3d456970f5c1bc73b6ddab0dd4fb753bd25e4ca4a85d7a93f11"} Apr 23 13:32:28.714494 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.714456 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9vj47" podStartSLOduration=3.6139255930000003 podStartE2EDuration="20.714445656s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.224734731 +0000 UTC m=+3.164909168" lastFinishedPulling="2026-04-23 13:32:28.325254784 +0000 UTC m=+20.265429231" observedRunningTime="2026-04-23 13:32:28.714438316 +0000 UTC m=+20.654612776" watchObservedRunningTime="2026-04-23 13:32:28.714445656 +0000 UTC m=+20.654620129" Apr 23 13:32:28.729498 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.729448 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qgrhz" podStartSLOduration=3.648222606 podStartE2EDuration="20.729433028s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.232692496 +0000 UTC m=+3.172866934" lastFinishedPulling="2026-04-23 13:32:28.313902919 +0000 UTC m=+20.254077356" observedRunningTime="2026-04-23 13:32:28.729049493 +0000 UTC m=+20.669223953" 
watchObservedRunningTime="2026-04-23 13:32:28.729433028 +0000 UTC m=+20.669607489" Apr 23 13:32:28.742827 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.742569 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h6brm" podStartSLOduration=3.668216686 podStartE2EDuration="20.742553536s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.234771533 +0000 UTC m=+3.174945972" lastFinishedPulling="2026-04-23 13:32:28.309108384 +0000 UTC m=+20.249282822" observedRunningTime="2026-04-23 13:32:28.74247077 +0000 UTC m=+20.682645229" watchObservedRunningTime="2026-04-23 13:32:28.742553536 +0000 UTC m=+20.682728014" Apr 23 13:32:28.775604 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.775554 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zjrsd" podStartSLOduration=3.696888499 podStartE2EDuration="20.77551682s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.230244301 +0000 UTC m=+3.170418738" lastFinishedPulling="2026-04-23 13:32:28.30887261 +0000 UTC m=+20.249047059" observedRunningTime="2026-04-23 13:32:28.762260898 +0000 UTC m=+20.702435358" watchObservedRunningTime="2026-04-23 13:32:28.77551682 +0000 UTC m=+20.715691279" Apr 23 13:32:28.775784 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:28.775753 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mx792" podStartSLOduration=3.6995115309999997 podStartE2EDuration="20.77574518s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.232650682 +0000 UTC m=+3.172825120" lastFinishedPulling="2026-04-23 13:32:28.308884327 +0000 UTC m=+20.249058769" observedRunningTime="2026-04-23 13:32:28.774882794 +0000 UTC m=+20.715057252" watchObservedRunningTime="2026-04-23 13:32:28.77574518 +0000 UTC 
m=+20.715919650" Apr 23 13:32:29.705648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.705627 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log" Apr 23 13:32:29.705993 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.705946 2581 generic.go:358] "Generic (PLEG): container finished" podID="8eaf8674-35ae-40d6-b12b-07e254516721" containerID="20bd556e77fc2ba644e15466153834428e6b04ba13beb2113748737fc908a197" exitCode=1 Apr 23 13:32:29.706057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.706025 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"f06b9335f971b4f128b35c3b6e374d3357333630bd7047c01b6ec1273a6ea068"} Apr 23 13:32:29.706057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.706052 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"51a3ec5d8a6e39bcb111e956ba565ab0dbe1df2302009faa8be21ee4a955a548"} Apr 23 13:32:29.706150 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.706063 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"a96ba58d1098b7eba8b2b959eb9a07a8956344513ab5941384b04cddf2f5c7af"} Apr 23 13:32:29.706150 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.706072 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"6f77c527fda4d3c2d3f94e2ea856fdecd757a59bdccc64c00e709874074b05b8"} Apr 23 13:32:29.706150 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.706082 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerDied","Data":"20bd556e77fc2ba644e15466153834428e6b04ba13beb2113748737fc908a197"} Apr 23 13:32:29.707383 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.707361 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-msldf" event={"ID":"50b7daad-20fe-4160-ba67-2e5371f39d68","Type":"ContainerStarted","Data":"15744bd2c9d8513be35e2303b0f394d14ceb0726005126db2d75bf1405e2ca05"} Apr 23 13:32:29.708770 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.708750 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="d9c057da575ad0036fc7c85f8e17cbd0bb9fd94894aaa5bc472f010c32511219" exitCode=0 Apr 23 13:32:29.708944 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.708903 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"d9c057da575ad0036fc7c85f8e17cbd0bb9fd94894aaa5bc472f010c32511219"} Apr 23 13:32:29.721587 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.721515 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-msldf" podStartSLOduration=4.646091227 podStartE2EDuration="21.721502163s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.233744617 +0000 UTC m=+3.173919058" lastFinishedPulling="2026-04-23 13:32:28.309155554 +0000 UTC m=+20.249329994" observedRunningTime="2026-04-23 13:32:29.721083765 +0000 UTC m=+21.661258224" watchObservedRunningTime="2026-04-23 13:32:29.721502163 +0000 UTC m=+21.661676623" Apr 23 13:32:29.841930 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.841816 2581 plugin_watcher.go:194] "Adding socket path or 
updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:32:29.987492 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.987473 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:29.988031 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:29.988009 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:30.539264 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.539162 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:29.84192621Z","UUID":"6d182f36-de2d-4bf6-881f-6e1c43c99705","Handler":null,"Name":"","Endpoint":""} Apr 23 13:32:30.542863 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.542838 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:32:30.543004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.542872 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.572469 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.572482 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:30.572590 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.572658 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:30.572732 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:30.572843 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:30.572788 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:30.640913 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.640885 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:30.641045 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:30.641021 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:30.641105 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:30.641070 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret podName:7060a6e1-461e-45bb-85bd-9609300f9b17 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:34.641057774 +0000 UTC m=+26.581232211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret") pod "global-pull-secret-syncer-jwjhj" (UID: "7060a6e1-461e-45bb-85bd-9609300f9b17") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:30.712034 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:30.711945 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" event={"ID":"2be66145-0b39-40a6-8234-797ef2cbdb16","Type":"ContainerStarted","Data":"44f8f477d5c1dd0f3fc92b1983ece827b74702798a83a4fe260719751f48da2e"} Apr 23 13:32:31.717914 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:31.717884 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log" Apr 23 13:32:31.718634 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:31.718279 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"9278fea18f19110a100fcbeb136fc97b45f13eb4cd568111c790955cf6012aec"} Apr 23 13:32:31.720566 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:31.720521 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" event={"ID":"2be66145-0b39-40a6-8234-797ef2cbdb16","Type":"ContainerStarted","Data":"1d59bbc88cacada91b450f3f328c1fc928371e2d3f19e19d2cb5ee41417b34bd"} Apr 23 13:32:31.720675 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:31.720580 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:31.738391 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:31.738346 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gktds" 
podStartSLOduration=3.674594348 podStartE2EDuration="23.738330593s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.2274124 +0000 UTC m=+3.167586845" lastFinishedPulling="2026-04-23 13:32:31.29114865 +0000 UTC m=+23.231323090" observedRunningTime="2026-04-23 13:32:31.738263478 +0000 UTC m=+23.678437937" watchObservedRunningTime="2026-04-23 13:32:31.738330593 +0000 UTC m=+23.678505053" Apr 23 13:32:32.573249 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:32.573214 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:32.573411 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:32.573355 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:32.573411 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:32.573358 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:32.573548 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:32.573430 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:32.573548 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:32.573472 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:32.573653 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:32.573553 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:34.573120 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.572882 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:34.573593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.572890 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:34.573593 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:34.573213 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:34.573593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.572891 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:34.573593 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:34.573280 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:34.573593 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:34.573353 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:34.672300 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.672270 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:34.672426 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:34.672356 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:34.672426 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:34.672402 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret podName:7060a6e1-461e-45bb-85bd-9609300f9b17 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:42.672388152 +0000 UTC m=+34.612562589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret") pod "global-pull-secret-syncer-jwjhj" (UID: "7060a6e1-461e-45bb-85bd-9609300f9b17") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:34.728858 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.728831 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log" Apr 23 13:32:34.729220 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.729201 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"042398fcfdb64762af96b75535186d101334071f3417d913193aa6c9bf3b1994"} Apr 23 13:32:34.729502 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.729477 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:34.729502 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.729504 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:34.729698 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.729517 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:34.729698 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.729656 2581 scope.go:117] "RemoveContainer" containerID="20bd556e77fc2ba644e15466153834428e6b04ba13beb2113748737fc908a197" Apr 23 13:32:34.730870 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.730849 2581 generic.go:358] "Generic (PLEG): container finished" 
podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="cf620d99d646f5850c374b08cdea2a1e2f487e166f697fb77d352b03e3b1a466" exitCode=0 Apr 23 13:32:34.730937 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.730887 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"cf620d99d646f5850c374b08cdea2a1e2f487e166f697fb77d352b03e3b1a466"} Apr 23 13:32:34.745179 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.745155 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:34.746077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:34.746059 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" Apr 23 13:32:35.672596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.672373 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jwjhj"] Apr 23 13:32:35.673063 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.672632 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:35.673063 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:35.672713 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:35.686577 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.686557 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l79sj"] Apr 23 13:32:35.686690 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.686641 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:35.686729 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:35.686705 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:35.687065 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.687048 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ctn87"] Apr 23 13:32:35.687146 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.687135 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:35.687252 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:35.687231 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:35.734496 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.734467 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="20162fe3a0f8b541e59940f81396c02b13898f63bdacf80f5c8130d107062642" exitCode=0 Apr 23 13:32:35.734673 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.734563 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"20162fe3a0f8b541e59940f81396c02b13898f63bdacf80f5c8130d107062642"} Apr 23 13:32:35.737626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.737606 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log" Apr 23 13:32:35.737923 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.737901 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" event={"ID":"8eaf8674-35ae-40d6-b12b-07e254516721","Type":"ContainerStarted","Data":"d059d602797035d39986ac59781699778ee4879dd741c5148a0b59ea3691583c"} Apr 23 13:32:35.782012 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:35.781974 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw" podStartSLOduration=10.459601259 podStartE2EDuration="27.781962478s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.234731681 +0000 UTC m=+3.174906118" lastFinishedPulling="2026-04-23 13:32:28.557092899 +0000 UTC m=+20.497267337" observedRunningTime="2026-04-23 13:32:35.78184027 +0000 UTC m=+27.722014728" watchObservedRunningTime="2026-04-23 13:32:35.781962478 +0000 UTC m=+27.722136994" Apr 23 13:32:36.742260 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:36.742231 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="d04100bbb346ee0e7ef7c2231662b85d755960a09baab6a9fbd588f0f4248c33" exitCode=0 Apr 23 13:32:36.742661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:36.742308 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"d04100bbb346ee0e7ef7c2231662b85d755960a09baab6a9fbd588f0f4248c33"} Apr 23 13:32:37.572691 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:37.572655 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:37.572894 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:37.572655 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:37.572894 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:37.572785 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:37.572894 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:37.572871 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:37.573069 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:37.572658 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:37.573069 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:37.572989 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:39.573066 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.573018 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:39.573787 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.573151 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:39.573787 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:39.573164 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l79sj" podUID="774a8870-9d9e-4314-a059-b58aad91c605" Apr 23 13:32:39.573787 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:39.573258 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwjhj" podUID="7060a6e1-461e-45bb-85bd-9609300f9b17" Apr 23 13:32:39.573787 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.573310 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:39.573787 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:39.573396 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctn87" podUID="e9cbb1cc-dcfc-4dac-99b7-8363fbef7774" Apr 23 13:32:39.635213 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.635175 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:39.635395 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.635319 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:39.635892 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:39.635872 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h6brm" Apr 23 13:32:41.359340 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.359264 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-103.ec2.internal" event="NodeReady" Apr 23 13:32:41.359827 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.359420 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:41.410597 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.410559 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lk6nc"] Apr 23 13:32:41.415891 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.415855 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-slr9t"] Apr 23 13:32:41.416070 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.416047 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.418310 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.418286 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\"" Apr 23 13:32:41.418454 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.418287 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:41.418636 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.418622 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:41.419742 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.419719 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:41.421815 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.421793 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:41.421972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.421845 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\"" Apr 23 13:32:41.421972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.421925 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:41.422125 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.422054 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:41.425597 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.425374 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-slr9t"] Apr 23 13:32:41.427516 ip-10-0-129-103 kubenswrapper[2581]: 
I0423 13:32:41.427496 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lk6nc"] Apr 23 13:32:41.523155 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523118 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfv9f\" (UniqueName: \"kubernetes.io/projected/58b54dde-cfd8-43cb-8a0f-80463679527c-kube-api-access-kfv9f\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.523155 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlxh\" (UniqueName: \"kubernetes.io/projected/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-kube-api-access-zwlxh\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:41.523398 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523179 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58b54dde-cfd8-43cb-8a0f-80463679527c-tmp-dir\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.523398 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58b54dde-cfd8-43cb-8a0f-80463679527c-config-volume\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.523398 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523343 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.523398 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.523363 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:41.572601 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.572564 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:41.572601 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.572601 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:41.572946 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.572925 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:41.575400 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575375 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:41.575400 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575395 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b4pgl\"" Apr 23 13:32:41.575400 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575409 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\"" Apr 23 13:32:41.575684 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575505 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:41.575782 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575764 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:32:41.575895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.575767 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:41.624745 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58b54dde-cfd8-43cb-8a0f-80463679527c-tmp-dir\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.624745 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624693 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/58b54dde-cfd8-43cb-8a0f-80463679527c-config-volume\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.624745 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624728 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624761 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624834 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfv9f\" (UniqueName: \"kubernetes.io/projected/58b54dde-cfd8-43cb-8a0f-80463679527c-kube-api-access-kfv9f\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.624862 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlxh\" (UniqueName: \"kubernetes.io/projected/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-kube-api-access-zwlxh\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:41.624906 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 
13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:41.624938 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:41.624983 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.124960873 +0000 UTC m=+34.065135314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found Apr 23 13:32:41.625029 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:41.625028 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.125008944 +0000 UTC m=+34.065183383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found Apr 23 13:32:41.625317 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.625063 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58b54dde-cfd8-43cb-8a0f-80463679527c-tmp-dir\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.625317 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.625274 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58b54dde-cfd8-43cb-8a0f-80463679527c-config-volume\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.636073 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.636045 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfv9f\" (UniqueName: \"kubernetes.io/projected/58b54dde-cfd8-43cb-8a0f-80463679527c-kube-api-access-kfv9f\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:41.636268 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:41.636248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlxh\" (UniqueName: \"kubernetes.io/projected/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-kube-api-access-zwlxh\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:42.128707 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.128665 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:42.128889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.128722 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:42.128889 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.128839 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:42.128974 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.128920 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:43.12890182 +0000 UTC m=+35.069076261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found Apr 23 13:32:42.128974 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.128839 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:42.128974 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.128950 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:43.128944591 +0000 UTC m=+35.069119032 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found Apr 23 13:32:42.329595 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.329550 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87" Apr 23 13:32:42.329794 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.329686 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:32:42.329794 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:42.329768 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:14.329751201 +0000 UTC m=+66.269925643 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : secret "metrics-daemon-secret" not found Apr 23 13:32:42.431004 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.430960 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:42.433630 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.433601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbjs\" (UniqueName: \"kubernetes.io/projected/774a8870-9d9e-4314-a059-b58aad91c605-kube-api-access-7bbjs\") pod \"network-check-target-l79sj\" (UID: \"774a8870-9d9e-4314-a059-b58aad91c605\") " pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:42.484235 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.484201 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l79sj" Apr 23 13:32:42.732686 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.732641 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:42.738459 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.738432 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7060a6e1-461e-45bb-85bd-9609300f9b17-original-pull-secret\") pod \"global-pull-secret-syncer-jwjhj\" (UID: \"7060a6e1-461e-45bb-85bd-9609300f9b17\") " pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:42.791251 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.791220 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwjhj" Apr 23 13:32:42.800978 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.800952 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l79sj"] Apr 23 13:32:42.805558 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:42.805465 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774a8870_9d9e_4314_a059_b58aad91c605.slice/crio-a6f05b0684cf4c44d4244664c44fbb5fe4f0734124950edcb03bacd30e445af5 WatchSource:0}: Error finding container a6f05b0684cf4c44d4244664c44fbb5fe4f0734124950edcb03bacd30e445af5: Status 404 returned error can't find the container with id a6f05b0684cf4c44d4244664c44fbb5fe4f0734124950edcb03bacd30e445af5 Apr 23 13:32:42.914750 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:42.914567 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jwjhj"] Apr 23 13:32:42.918265 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:42.918230 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7060a6e1_461e_45bb_85bd_9609300f9b17.slice/crio-ec445a220ed91a2eed083b2b11515d15cfe5950fe5b8ba0214df1df4188a23ba WatchSource:0}: Error finding container ec445a220ed91a2eed083b2b11515d15cfe5950fe5b8ba0214df1df4188a23ba: Status 404 returned error can't find the container with id ec445a220ed91a2eed083b2b11515d15cfe5950fe5b8ba0214df1df4188a23ba Apr 23 13:32:43.135662 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:43.135627 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc" Apr 23 13:32:43.135876 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:32:43.135668 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t" Apr 23 13:32:43.135876 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:43.135764 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:43.135876 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:43.135766 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:43.135876 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:43.135815 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.135801891 +0000 UTC m=+37.075976328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found Apr 23 13:32:43.135876 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:43.135832 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.135823661 +0000 UTC m=+37.075998097 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:43.759996 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:43.759963 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="4c57d8cb34ac9b91a1f06b38b03ecdf4f6f59925efff23ae25266e31d74a1b2c" exitCode=0
Apr 23 13:32:43.760406 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:43.760055 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"4c57d8cb34ac9b91a1f06b38b03ecdf4f6f59925efff23ae25266e31d74a1b2c"}
Apr 23 13:32:43.761259 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:43.761167 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jwjhj" event={"ID":"7060a6e1-461e-45bb-85bd-9609300f9b17","Type":"ContainerStarted","Data":"ec445a220ed91a2eed083b2b11515d15cfe5950fe5b8ba0214df1df4188a23ba"}
Apr 23 13:32:43.762085 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:43.762053 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l79sj" event={"ID":"774a8870-9d9e-4314-a059-b58aad91c605","Type":"ContainerStarted","Data":"a6f05b0684cf4c44d4244664c44fbb5fe4f0734124950edcb03bacd30e445af5"}
Apr 23 13:32:44.768379 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:44.768338 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe05d38e-d020-46dd-95d4-832fb5c93359" containerID="7122ef1c0f583dd6d2de4dae67473ceced15c44983f991f97474744af6ec5bf5" exitCode=0
Apr 23 13:32:44.770479 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:44.768410 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerDied","Data":"7122ef1c0f583dd6d2de4dae67473ceced15c44983f991f97474744af6ec5bf5"}
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:45.151743 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:45.151798 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:45.151919 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:45.151922 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:45.151981 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.151961903 +0000 UTC m=+41.092136341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found
Apr 23 13:32:45.152347 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:45.151997 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.15199141 +0000 UTC m=+41.092165847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:45.775337 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:45.775298 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" event={"ID":"fe05d38e-d020-46dd-95d4-832fb5c93359","Type":"ContainerStarted","Data":"900f8cb36b06a45fdc4f723d01d95c5a99cb29b4eb94c7fe238c61259e34f2f2"}
Apr 23 13:32:45.800263 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:45.800204 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vwqk2" podStartSLOduration=6.410335491 podStartE2EDuration="37.800184178s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.229736478 +0000 UTC m=+3.169910914" lastFinishedPulling="2026-04-23 13:32:42.619585161 +0000 UTC m=+34.559759601" observedRunningTime="2026-04-23 13:32:45.799979702 +0000 UTC m=+37.740154174" watchObservedRunningTime="2026-04-23 13:32:45.800184178 +0000 UTC m=+37.740358638"
Apr 23 13:32:48.782207 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:48.782165 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jwjhj" event={"ID":"7060a6e1-461e-45bb-85bd-9609300f9b17","Type":"ContainerStarted","Data":"8cec7bd3f8c54b9a0c3d3739fade18708c7d0c648bfa5bc8d87a8e38219f4197"}
Apr 23 13:32:48.783580 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:48.783544 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l79sj" event={"ID":"774a8870-9d9e-4314-a059-b58aad91c605","Type":"ContainerStarted","Data":"feb67b4355a4d9bc957cbc86f34434e1f23d736da373805d27c05d5d4f42f2aa"}
Apr 23 13:32:48.783699 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:48.783606 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-l79sj"
Apr 23 13:32:48.803650 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:48.803600 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jwjhj" podStartSLOduration=17.950986461 podStartE2EDuration="22.803588436s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:42.919883942 +0000 UTC m=+34.860058385" lastFinishedPulling="2026-04-23 13:32:47.772485921 +0000 UTC m=+39.712660360" observedRunningTime="2026-04-23 13:32:48.803566877 +0000 UTC m=+40.743741333" watchObservedRunningTime="2026-04-23 13:32:48.803588436 +0000 UTC m=+40.743762894"
Apr 23 13:32:49.182564 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:49.182541 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:32:49.182668 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:49.182576 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:32:49.182668 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:49.182647 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:49.182668 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:49.182660 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:49.182786 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:49.182707 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.182690454 +0000 UTC m=+49.122864893 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:49.182786 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:49.182720 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.182714252 +0000 UTC m=+49.122888689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found
Apr 23 13:32:57.230913 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:57.230859 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:32:57.230913 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:57.230901 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:32:57.231479 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:57.231024 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:57.231479 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:57.231024 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:57.231479 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:57.231098 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.231078005 +0000 UTC m=+65.171252442 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found
Apr 23 13:32:57.231479 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:32:57.231113 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.231106289 +0000 UTC m=+65.171280726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:58.346923 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.346878 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-l79sj" podStartSLOduration=45.392974308 podStartE2EDuration="50.34686325s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:42.808117765 +0000 UTC m=+34.748292216" lastFinishedPulling="2026-04-23 13:32:47.76200671 +0000 UTC m=+39.702181158" observedRunningTime="2026-04-23 13:32:48.837106731 +0000 UTC m=+40.777281206" watchObservedRunningTime="2026-04-23 13:32:58.34686325 +0000 UTC m=+50.287037728"
Apr 23 13:32:58.347480 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.347460 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"]
Apr 23 13:32:58.386149 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.386122 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"]
Apr 23 13:32:58.386243 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.386165 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.388623 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.388602 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 13:32:58.388985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.388969 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 13:32:58.389107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.389087 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 13:32:58.389166 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.389087 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 13:32:58.389653 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.389637 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 13:32:58.389745 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.389715 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 13:32:58.389745 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.389715 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 13:32:58.439589 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439564 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.439687 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439609 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.439687 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439632 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.439687 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439671 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stprb\" (UniqueName: \"kubernetes.io/projected/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-kube-api-access-stprb\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.439792 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439711 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.439792 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.439777 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.539976 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.539951 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540025 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540051 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stprb\" (UniqueName: \"kubernetes.io/projected/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-kube-api-access-stprb\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540256 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540092 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540256 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.540832 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.540809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.544438 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.544419 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.544554 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.544493 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.544608 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.544576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.544815 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.544795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-ca\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.548578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.548556 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stprb\" (UniqueName: \"kubernetes.io/projected/d4e483b9-2cd1-4e40-97d6-718ee57eb42b-kube-api-access-stprb\") pod \"cluster-proxy-proxy-agent-76b996fc64-lj694\" (UID: \"d4e483b9-2cd1-4e40-97d6-718ee57eb42b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.702637 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.702616 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:32:58.820923 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:58.820893 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"]
Apr 23 13:32:58.824502 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:32:58.824460 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4e483b9_2cd1_4e40_97d6_718ee57eb42b.slice/crio-6fe93e5200280519347d312ba625ad1567036c136c9df1cfb1607a30ac4a5d04 WatchSource:0}: Error finding container 6fe93e5200280519347d312ba625ad1567036c136c9df1cfb1607a30ac4a5d04: Status 404 returned error can't find the container with id 6fe93e5200280519347d312ba625ad1567036c136c9df1cfb1607a30ac4a5d04
Apr 23 13:32:59.805582 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:32:59.805542 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerStarted","Data":"6fe93e5200280519347d312ba625ad1567036c136c9df1cfb1607a30ac4a5d04"}
Apr 23 13:33:06.753977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:06.753951 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4shpw"
Apr 23 13:33:07.821760 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:07.821732 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerStarted","Data":"7f97599a0013e40b88a77e3273e055411e24f662f7ff74276780127b4d76e401"}
Apr 23 13:33:11.829781 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:11.829747 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerStarted","Data":"6881755f14f72a45aa090fa905714abd9870bd1760e5109f7325a955e96cea53"}
Apr 23 13:33:11.830219 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:11.829785 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerStarted","Data":"e3317d77ad309edc6dba7e617b7ee963c1c703fb0518b3d5f76ac14d32cdf4fb"}
Apr 23 13:33:11.848026 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:11.847984 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" podStartSLOduration=1.533457087 podStartE2EDuration="13.847973042s" podCreationTimestamp="2026-04-23 13:32:58 +0000 UTC" firstStartedPulling="2026-04-23 13:32:58.82573938 +0000 UTC m=+50.765913816" lastFinishedPulling="2026-04-23 13:33:11.140255333 +0000 UTC m=+63.080429771" observedRunningTime="2026-04-23 13:33:11.846544232 +0000 UTC m=+63.786718682" watchObservedRunningTime="2026-04-23 13:33:11.847973042 +0000 UTC m=+63.788147502"
Apr 23 13:33:13.238329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:13.238300 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:33:13.238329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:13.238332 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:33:13.238775 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:13.238413 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:13.238775 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:13.238415 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:13.238775 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:13.238458 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:45.23844534 +0000 UTC m=+97.178619776 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found
Apr 23 13:33:13.238775 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:13.238469 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:33:45.238463882 +0000 UTC m=+97.178638319 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:14.346033 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:14.345998 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87"
Apr 23 13:33:14.346370 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:14.346137 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:33:14.346370 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:14.346221 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs podName:e9cbb1cc-dcfc-4dac-99b7-8363fbef7774 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:18.346196588 +0000 UTC m=+130.286371026 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs") pod "network-metrics-daemon-ctn87" (UID: "e9cbb1cc-dcfc-4dac-99b7-8363fbef7774") : secret "metrics-daemon-secret" not found
Apr 23 13:33:19.787385 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:19.787352 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l79sj"
Apr 23 13:33:45.246786 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:45.246753 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:33:45.247098 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:45.246792 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:33:45.247098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:45.246891 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:45.247098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:45.246894 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:45.247098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:45.246953 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert podName:9d0ae972-b2b8-41fe-a688-e9f33be2d8f1 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.246937168 +0000 UTC m=+161.187111606 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert") pod "ingress-canary-slr9t" (UID: "9d0ae972-b2b8-41fe-a688-e9f33be2d8f1") : secret "canary-serving-cert" not found
Apr 23 13:33:45.247098 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:33:45.246966 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls podName:58b54dde-cfd8-43cb-8a0f-80463679527c nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.24696041 +0000 UTC m=+161.187134846 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls") pod "dns-default-lk6nc" (UID: "58b54dde-cfd8-43cb-8a0f-80463679527c") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:53.458921 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.458885 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"]
Apr 23 13:33:53.460675 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.460657 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"
Apr 23 13:33:53.463142 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.463119 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-p785g\""
Apr 23 13:33:53.471108 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.471082 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"]
Apr 23 13:33:53.502823 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.502802 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfl6t\" (UniqueName: \"kubernetes.io/projected/04f56a99-0f36-4cb1-bbb4-5f1009f833dc-kube-api-access-vfl6t\") pod \"network-check-source-8894fc9bd-lwwp4\" (UID: \"04f56a99-0f36-4cb1-bbb4-5f1009f833dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"
Apr 23 13:33:53.603044 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.603004 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfl6t\" (UniqueName: \"kubernetes.io/projected/04f56a99-0f36-4cb1-bbb4-5f1009f833dc-kube-api-access-vfl6t\") pod \"network-check-source-8894fc9bd-lwwp4\" (UID: \"04f56a99-0f36-4cb1-bbb4-5f1009f833dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"
Apr 23 13:33:53.611736 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.611711 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfl6t\" (UniqueName: \"kubernetes.io/projected/04f56a99-0f36-4cb1-bbb4-5f1009f833dc-kube-api-access-vfl6t\") pod \"network-check-source-8894fc9bd-lwwp4\" (UID: \"04f56a99-0f36-4cb1-bbb4-5f1009f833dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"
Apr 23 13:33:53.769925 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.769852 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"
Apr 23 13:33:53.887816 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.887785 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4"]
Apr 23 13:33:53.890859 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:33:53.890818 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04f56a99_0f36_4cb1_bbb4_5f1009f833dc.slice/crio-566c71749ee5fc5b41c0267b0756a7bc6238821a53df87f734fdeaece256c063 WatchSource:0}: Error finding container 566c71749ee5fc5b41c0267b0756a7bc6238821a53df87f734fdeaece256c063: Status 404 returned error can't find the container with id 566c71749ee5fc5b41c0267b0756a7bc6238821a53df87f734fdeaece256c063
Apr 23 13:33:53.908503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:53.908479 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4" event={"ID":"04f56a99-0f36-4cb1-bbb4-5f1009f833dc","Type":"ContainerStarted","Data":"566c71749ee5fc5b41c0267b0756a7bc6238821a53df87f734fdeaece256c063"}
Apr 23 13:33:54.912654 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:54.912604 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4" event={"ID":"04f56a99-0f36-4cb1-bbb4-5f1009f833dc","Type":"ContainerStarted","Data":"461131c824ccb77a384be95f42172d84343b55bb4a3b0202cd2ce4e6ef71bb0d"}
Apr 23 13:33:54.929000 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:33:54.928955 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lwwp4" podStartSLOduration=1.928939387 podStartE2EDuration="1.928939387s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:33:54.92856532 +0000 UTC m=+106.868739782" watchObservedRunningTime="2026-04-23 13:33:54.928939387 +0000 UTC m=+106.869113845"
Apr 23 13:34:00.879035 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:00.879004 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qgrhz_b35db408-8233-451b-984c-90d90df7a815/dns-node-resolver/0.log"
Apr 23 13:34:01.679052 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:01.679025 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mx792_01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c/node-ca/0.log"
Apr 23 13:34:18.370952 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.370917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87"
Apr 23 13:34:18.373321 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.373284 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9cbb1cc-dcfc-4dac-99b7-8363fbef7774-metrics-certs\") pod \"network-metrics-daemon-ctn87\" (UID: \"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774\") " pod="openshift-multus/network-metrics-daemon-ctn87"
Apr 23 13:34:18.500077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.500050 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\""
Apr 23 13:34:18.508024 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.508006 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctn87"
Apr 23 13:34:18.625099 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.625012 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ctn87"]
Apr 23 13:34:18.629025 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:18.628994 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9cbb1cc_dcfc_4dac_99b7_8363fbef7774.slice/crio-79ff96b2fbf5fc7be6def2a7014751c0902e5e81b9bb1acbe3c3a5e4c1c1a793 WatchSource:0}: Error finding container 79ff96b2fbf5fc7be6def2a7014751c0902e5e81b9bb1acbe3c3a5e4c1c1a793: Status 404 returned error can't find the container with id 79ff96b2fbf5fc7be6def2a7014751c0902e5e81b9bb1acbe3c3a5e4c1c1a793
Apr 23 13:34:18.970618 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:18.970577 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctn87" event={"ID":"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774","Type":"ContainerStarted","Data":"79ff96b2fbf5fc7be6def2a7014751c0902e5e81b9bb1acbe3c3a5e4c1c1a793"}
Apr 23 13:34:19.974761 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:19.974723 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctn87" event={"ID":"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774","Type":"ContainerStarted","Data":"3ab4525268ea07132ca2720ecf19d8e2e10921d3c03ae686d807569ba6bdfce0"}
Apr 23 13:34:19.974761 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:19.974763 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctn87" event={"ID":"e9cbb1cc-dcfc-4dac-99b7-8363fbef7774","Type":"ContainerStarted","Data":"a282bb8e98361643f57aa09315e29cddfc3cc688e2460d760468147b294c3fdf"}
Apr 23 13:34:19.990870 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:19.990831 2581 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-ctn87" podStartSLOduration=131.025749522 podStartE2EDuration="2m11.990818132s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:34:18.630774412 +0000 UTC m=+130.570948849" lastFinishedPulling="2026-04-23 13:34:19.595843022 +0000 UTC m=+131.536017459" observedRunningTime="2026-04-23 13:34:19.990158212 +0000 UTC m=+131.930332649" watchObservedRunningTime="2026-04-23 13:34:19.990818132 +0000 UTC m=+131.930992588" Apr 23 13:34:26.491681 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.491644 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vc88g"] Apr 23 13:34:26.493466 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.493451 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.497013 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.496978 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9khvh\"" Apr 23 13:34:26.497148 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.497029 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:34:26.497148 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.497078 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:26.497148 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.497115 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:26.497314 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.497215 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:34:26.516427 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.516403 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vc88g"] Apr 23 13:34:26.593202 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.593178 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59f44d7dcb-lmrp5"] Apr 23 13:34:26.594791 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.594773 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.605521 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.605504 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:34:26.605618 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.605578 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nw8fz\"" Apr 23 13:34:26.605668 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.605647 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:34:26.605792 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.605779 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:34:26.606128 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.606113 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:34:26.620420 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.620399 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59f44d7dcb-lmrp5"] Apr 23 
13:34:26.631406 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.631388 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c3988c-0d03-4871-8bc1-3e6fe2005562-data-volume\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.631494 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.631441 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c3988c-0d03-4871-8bc1-3e6fe2005562-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.631494 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.631460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dsx\" (UniqueName: \"kubernetes.io/projected/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-api-access-p4dsx\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.631598 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.631512 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.631598 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.631561 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c3988c-0d03-4871-8bc1-3e6fe2005562-crio-socket\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.732742 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.732714 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dsx\" (UniqueName: \"kubernetes.io/projected/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-api-access-p4dsx\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.732865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.732752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-installation-pull-secrets\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.732865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.732786 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cbnx\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-kube-api-access-7cbnx\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.732865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.732832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " 
pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733006 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.732868 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-bound-sa-token\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.733059 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c3988c-0d03-4871-8bc1-3e6fe2005562-crio-socket\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-image-registry-private-configuration\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.733160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733106 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c3988c-0d03-4871-8bc1-3e6fe2005562-data-volume\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733120 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c3988c-0d03-4871-8bc1-3e6fe2005562-crio-socket\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733135 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-certificates\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.733302 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733195 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-tls\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.733356 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c3988c-0d03-4871-8bc1-3e6fe2005562-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733405 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733379 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " 
pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733405 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733385 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c3988c-0d03-4871-8bc1-3e6fe2005562-data-volume\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.733510 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733386 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-ca-trust-extracted\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.733510 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.733458 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-trusted-ca\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.735750 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.735727 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c3988c-0d03-4871-8bc1-3e6fe2005562-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.741237 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.741219 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dsx\" 
(UniqueName: \"kubernetes.io/projected/74c3988c-0d03-4871-8bc1-3e6fe2005562-kube-api-access-p4dsx\") pod \"insights-runtime-extractor-vc88g\" (UID: \"74c3988c-0d03-4871-8bc1-3e6fe2005562\") " pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.802898 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.802841 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vc88g" Apr 23 13:34:26.833899 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.833865 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cbnx\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-kube-api-access-7cbnx\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.833899 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.833897 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-bound-sa-token\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834074 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.833934 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-image-registry-private-configuration\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834074 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.833968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-certificates\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834074 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834003 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-tls\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834110 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-ca-trust-extracted\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834150 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-trusted-ca\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-installation-pull-secrets\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 
13:34:26.834640 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834614 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-ca-trust-extracted\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.834910 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.834884 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-certificates\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.835075 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.835057 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-trusted-ca\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.836506 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.836488 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-image-registry-private-configuration\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.836841 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.836824 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-registry-tls\") pod 
\"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.837118 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.837098 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-installation-pull-secrets\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.849709 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.849664 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cbnx\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-kube-api-access-7cbnx\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.850278 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.850250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f89c38d-dd50-4f03-a48f-87d000b0dd2b-bound-sa-token\") pod \"image-registry-59f44d7dcb-lmrp5\" (UID: \"1f89c38d-dd50-4f03-a48f-87d000b0dd2b\") " pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.903620 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.903594 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:26.920129 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.920096 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vc88g"] Apr 23 13:34:26.922765 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:26.922737 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c3988c_0d03_4871_8bc1_3e6fe2005562.slice/crio-f63136c1c89607305b60f8a4eafd46c72d9cce33102f49a89657498c642a73f4 WatchSource:0}: Error finding container f63136c1c89607305b60f8a4eafd46c72d9cce33102f49a89657498c642a73f4: Status 404 returned error can't find the container with id f63136c1c89607305b60f8a4eafd46c72d9cce33102f49a89657498c642a73f4 Apr 23 13:34:26.993691 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.993631 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vc88g" event={"ID":"74c3988c-0d03-4871-8bc1-3e6fe2005562","Type":"ContainerStarted","Data":"4ce7d25ac047e5db9d70eaa82e31f557a1626dd21a0bb0af02b55ff204daba0f"} Apr 23 13:34:26.993691 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:26.993677 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vc88g" event={"ID":"74c3988c-0d03-4871-8bc1-3e6fe2005562","Type":"ContainerStarted","Data":"f63136c1c89607305b60f8a4eafd46c72d9cce33102f49a89657498c642a73f4"} Apr 23 13:34:27.028050 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:27.028011 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59f44d7dcb-lmrp5"] Apr 23 13:34:27.031541 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:27.031498 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f89c38d_dd50_4f03_a48f_87d000b0dd2b.slice/crio-253f86e7baacf7f5ddfaeb92293c34f54abd2db1924ff632047e9ab8abdd72c0 WatchSource:0}: Error finding container 253f86e7baacf7f5ddfaeb92293c34f54abd2db1924ff632047e9ab8abdd72c0: Status 404 returned error can't find the container with id 253f86e7baacf7f5ddfaeb92293c34f54abd2db1924ff632047e9ab8abdd72c0 Apr 23 13:34:27.998233 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:27.998192 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vc88g" event={"ID":"74c3988c-0d03-4871-8bc1-3e6fe2005562","Type":"ContainerStarted","Data":"57cfc24b76004564bce7819d48a18b5f4361a5926e983aa998328b324d0bd8d4"} Apr 23 13:34:27.999357 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:27.999334 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" event={"ID":"1f89c38d-dd50-4f03-a48f-87d000b0dd2b","Type":"ContainerStarted","Data":"3019f58831acd28ba8c234abb12234ddb322c1ac573a32b7ca568f4028e191a0"} Apr 23 13:34:27.999461 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:27.999364 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" event={"ID":"1f89c38d-dd50-4f03-a48f-87d000b0dd2b","Type":"ContainerStarted","Data":"253f86e7baacf7f5ddfaeb92293c34f54abd2db1924ff632047e9ab8abdd72c0"} Apr 23 13:34:27.999461 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:27.999453 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" Apr 23 13:34:28.018192 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:28.018147 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" podStartSLOduration=2.018135866 podStartE2EDuration="2.018135866s" 
podCreationTimestamp="2026-04-23 13:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:28.017561686 +0000 UTC m=+139.957736141" watchObservedRunningTime="2026-04-23 13:34:28.018135866 +0000 UTC m=+139.958310325"
Apr 23 13:34:30.006909 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:30.006867 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vc88g" event={"ID":"74c3988c-0d03-4871-8bc1-3e6fe2005562","Type":"ContainerStarted","Data":"9f008eb3b8e9bcdf377c99b7b7dc6845d66ee2310c9d1215346194ae20ddc7b3"}
Apr 23 13:34:30.036206 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:30.036160 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vc88g" podStartSLOduration=1.930912223 podStartE2EDuration="4.036146752s" podCreationTimestamp="2026-04-23 13:34:26 +0000 UTC" firstStartedPulling="2026-04-23 13:34:26.97658353 +0000 UTC m=+138.916757966" lastFinishedPulling="2026-04-23 13:34:29.081818059 +0000 UTC m=+141.021992495" observedRunningTime="2026-04-23 13:34:30.035713111 +0000 UTC m=+141.975887591" watchObservedRunningTime="2026-04-23 13:34:30.036146752 +0000 UTC m=+141.976321221"
Apr 23 13:34:36.113162 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.113131 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2qd4l"]
Apr 23 13:34:36.115638 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.115617 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.118576 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.118551 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-24rls\""
Apr 23 13:34:36.118576 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.118570 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:34:36.118750 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.118552 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 13:34:36.119339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.119321 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 13:34:36.119421 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.119371 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:34:36.119421 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.119332 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 13:34:36.119421 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.119344 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 13:34:36.307868 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.307843 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-sys\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.307986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.307872 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.307986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.307902 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-wtmp\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308056 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.307984 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-tls\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308056 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.308027 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fww\" (UniqueName: \"kubernetes.io/projected/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-kube-api-access-42fww\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308056 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.308045 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-textfile\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308143 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.308085 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-root\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308143 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.308103 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.308143 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.308137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-metrics-client-ca\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.408986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.408912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-textfile\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.408986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.408954 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-root\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.408986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.408974 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409182 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409049 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-root\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409182 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-metrics-client-ca\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409182 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409156 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-sys\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409185 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409210 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-sys\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409217 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-wtmp\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409256 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-tls\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409260 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-textfile\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409318 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42fww\" (UniqueName: \"kubernetes.io/projected/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-kube-api-access-42fww\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409637 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409415 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-wtmp\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.409812 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.409788 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-metrics-client-ca\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.410085 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.410057 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.411985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.411963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-tls\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.412029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.412014 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.418722 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.418690 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fww\" (UniqueName: \"kubernetes.io/projected/9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f-kube-api-access-42fww\") pod \"node-exporter-2qd4l\" (UID: \"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f\") " pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.425419 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:36.425395 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2qd4l"
Apr 23 13:34:36.435228 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:36.435195 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0c9cfe_364c_4cc2_b3a7_574e52e2fe9f.slice/crio-623b07474a5ccae55800ae51b3ba8a054874ca3e4701ba52621119c473b0c901 WatchSource:0}: Error finding container 623b07474a5ccae55800ae51b3ba8a054874ca3e4701ba52621119c473b0c901: Status 404 returned error can't find the container with id 623b07474a5ccae55800ae51b3ba8a054874ca3e4701ba52621119c473b0c901
Apr 23 13:34:37.024661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.024626 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qd4l" event={"ID":"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f","Type":"ContainerStarted","Data":"623b07474a5ccae55800ae51b3ba8a054874ca3e4701ba52621119c473b0c901"}
Apr 23 13:34:37.194766 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.194741 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:34:37.201728 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.197772 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.202905 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202619 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 13:34:37.202905 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202690 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 13:34:37.202905 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202752 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 13:34:37.203117 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202906 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 13:34:37.203117 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202918 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 13:34:37.203117 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.202909 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 13:34:37.203262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.203247 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5tgj2\""
Apr 23 13:34:37.203550 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.203434 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 13:34:37.203550 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.203452 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 13:34:37.207508 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.207488 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 13:34:37.211192 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.211150 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:34:37.315244 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315244 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315219 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315378 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315287 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315378 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315317 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315378 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315363 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315602 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315501 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315602 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315561 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315602 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315584 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h77kc\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315705 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315606 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.315705 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.315623 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.416908 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.416884 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.416983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.416917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.416983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.416939 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417091 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417067 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417135 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417093 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417135 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417301 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417359 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417325 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417434 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417434 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417393 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h77kc\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417548 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417431 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417548 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417548 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417508 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417876 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:34:37.417667 2581 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 23 13:34:37.417931 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.417865 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.417931 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:34:37.417894 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls podName:92f78839-a8c4-4f1c-9e12-af32079bdeaf nodeName:}" failed. No retries permitted until 2026-04-23 13:34:37.917713199 +0000 UTC m=+149.857887636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf") : secret "alertmanager-main-tls" not found
Apr 23 13:34:37.418605 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.418256 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.419914 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.419885 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.420290 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420230 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.420290 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420240 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.420578 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420559 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.420857 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420831 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.420978 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420958 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.421040 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.420993 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.421040 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.421033 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.421142 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.421130 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.427177 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.427158 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h77kc\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.921922 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.921893 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:37.924255 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:37.924231 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:38.029086 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:38.029055 2581 generic.go:358] "Generic (PLEG): container finished" podID="9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f" containerID="09c15a561e2910fc340424560b7172255f2d9bc578087dac9ce990e463496d02" exitCode=0
Apr 23 13:34:38.029185 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:38.029091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qd4l" event={"ID":"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f","Type":"ContainerDied","Data":"09c15a561e2910fc340424560b7172255f2d9bc578087dac9ce990e463496d02"}
Apr 23 13:34:38.112097 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:38.112078 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:34:38.257382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:38.257356 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:34:38.259769 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:38.259745 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f78839_a8c4_4f1c_9e12_af32079bdeaf.slice/crio-7d8139ea93833740f0277f20fd3173a973039bd9ef6a941ab9ea667f3ce38c60 WatchSource:0}: Error finding container 7d8139ea93833740f0277f20fd3173a973039bd9ef6a941ab9ea667f3ce38c60: Status 404 returned error can't find the container with id 7d8139ea93833740f0277f20fd3173a973039bd9ef6a941ab9ea667f3ce38c60
Apr 23 13:34:39.033752 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:39.033717 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qd4l" event={"ID":"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f","Type":"ContainerStarted","Data":"a24fd00238c3d63a0330679222c788165452c9cdb57e703f4ff41b570ed576d3"}
Apr 23 13:34:39.033752 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:39.033757 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qd4l" event={"ID":"9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f","Type":"ContainerStarted","Data":"59d27e6bec6072aa12c363f2e6e2241cf0032ee0bc1344898fe06dac77890c3d"}
Apr 23 13:34:39.034810 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:39.034774 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"7d8139ea93833740f0277f20fd3173a973039bd9ef6a941ab9ea667f3ce38c60"}
Apr 23 13:34:39.055582 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:39.055541 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-monitoring/node-exporter-2qd4l" podStartSLOduration=2.383771105 podStartE2EDuration="3.05551088s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.437671528 +0000 UTC m=+148.377845966" lastFinishedPulling="2026-04-23 13:34:37.109411305 +0000 UTC m=+149.049585741" observedRunningTime="2026-04-23 13:34:39.054489303 +0000 UTC m=+150.994663763" watchObservedRunningTime="2026-04-23 13:34:39.05551088 +0000 UTC m=+150.995685338" Apr 23 13:34:40.039685 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:40.039641 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7" exitCode=0 Apr 23 13:34:40.040117 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:40.039737 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7"} Apr 23 13:34:41.330811 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.330738 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-586489cc6d-wv59j"] Apr 23 13:34:41.333087 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.333068 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.335895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.335608 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 13:34:41.335895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.335673 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jjfdc\"" Apr 23 13:34:41.335895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.335723 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 13:34:41.335895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.335673 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 13:34:41.336190 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.336009 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 13:34:41.336190 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.336011 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 13:34:41.340636 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.340611 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 13:34:41.346056 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.346023 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586489cc6d-wv59j"] Apr 23 13:34:41.347159 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347129 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347259 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347178 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-federate-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347313 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347261 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-metrics-client-ca\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347313 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347306 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-serving-certs-ca-bundle\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347418 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347360 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347471 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347430 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347471 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347456 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.347587 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.347483 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jvs\" (UniqueName: \"kubernetes.io/projected/a151d0c0-6a27-40f5-b47a-6aa6b008093f-kube-api-access-77jvs\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.448552 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.448512 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: 
\"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.448665 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.448610 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.448665 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.448641 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.448776 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.448667 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77jvs\" (UniqueName: \"kubernetes.io/projected/a151d0c0-6a27-40f5-b47a-6aa6b008093f-kube-api-access-77jvs\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.448943 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.448917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.449044 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:34:41.448975 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-federate-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.449044 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.449013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-metrics-client-ca\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.449156 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.449045 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-serving-certs-ca-bundle\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.449567 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.449477 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.449922 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.449878 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-serving-certs-ca-bundle\") pod 
\"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.450328 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.450302 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a151d0c0-6a27-40f5-b47a-6aa6b008093f-metrics-client-ca\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.451051 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.451031 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.451191 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.451172 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-telemeter-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.451394 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.451377 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.452148 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:34:41.452129 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a151d0c0-6a27-40f5-b47a-6aa6b008093f-federate-client-tls\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.472487 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.472463 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jvs\" (UniqueName: \"kubernetes.io/projected/a151d0c0-6a27-40f5-b47a-6aa6b008093f-kube-api-access-77jvs\") pod \"telemeter-client-586489cc6d-wv59j\" (UID: \"a151d0c0-6a27-40f5-b47a-6aa6b008093f\") " pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.644475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.644449 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" Apr 23 13:34:41.780946 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:41.780921 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586489cc6d-wv59j"] Apr 23 13:34:41.783262 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:41.783231 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda151d0c0_6a27_40f5_b47a_6aa6b008093f.slice/crio-3f22ee2c815d47ed7598f2d3475439d6bc5f193001fc22c09ec38d0046927def WatchSource:0}: Error finding container 3f22ee2c815d47ed7598f2d3475439d6bc5f193001fc22c09ec38d0046927def: Status 404 returned error can't find the container with id 3f22ee2c815d47ed7598f2d3475439d6bc5f193001fc22c09ec38d0046927def Apr 23 13:34:42.047163 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.047128 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"} Apr 23 13:34:42.047163 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.047169 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"} Apr 23 13:34:42.047413 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.047179 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"} Apr 23 13:34:42.047413 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.047188 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"} Apr 23 13:34:42.047413 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.047198 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"} Apr 23 13:34:42.048082 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.048060 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" event={"ID":"a151d0c0-6a27-40f5-b47a-6aa6b008093f","Type":"ContainerStarted","Data":"3f22ee2c815d47ed7598f2d3475439d6bc5f193001fc22c09ec38d0046927def"} Apr 23 13:34:42.352344 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.352254 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:34:42.355352 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.355048 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.358737 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.358711 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 13:34:42.358979 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.358953 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 13:34:42.359115 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.359093 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 13:34:42.359289 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.359258 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 13:34:42.359289 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.359258 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 13:34:42.359428 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.359269 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5162j7pc9hkj7\"" Apr 23 13:34:42.360172 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360151 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5rw9w\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360609 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360687 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360742 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360693 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360788 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 13:34:42.360865 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.360804 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 13:34:42.364005 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.363453 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 13:34:42.366211 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.366191 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 13:34:42.371499 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.371476 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:34:42.457040 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457003 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgsb\" (UniqueName: 
\"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457198 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457104 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457268 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457178 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457268 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457370 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457264 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457370 
ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457370 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457324 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457370 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457350 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457414 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457469 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457492 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457514 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457593 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457562 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457613 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457639 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.457889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.457656 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558165 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558142 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558286 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558286 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558211 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558447 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558426 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558505 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558489 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558569 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558520 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558630 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558570 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558630 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558600 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558719 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558650 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgsb\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558719 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558708 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558808 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558745 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558808 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558773 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558808 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558802 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558951 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558840 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558951 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558875 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558951 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558898 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.558951 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.559130 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.558986 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.559802 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.559775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.561509 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.561797 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.561935 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562228 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562318 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562380 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.563262 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.562721 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.564140 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.563633 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.565154 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.565102 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.565429 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.565398 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.565606 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.565585 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.565946 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.565925 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.566888 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.566867 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.567269 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.567233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.567480 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.567457 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgsb\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb\") pod \"prometheus-k8s-0\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.668503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.668450 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:42.810820 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:42.810792 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:42.813258 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:42.813226 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b99f39_20d2_44f1_92ac_26437b23a8f2.slice/crio-8ffc0adaf0b50b1155b9ea799d184c9113cf67042aaae7c311add66325fb4d93 WatchSource:0}: Error finding container 8ffc0adaf0b50b1155b9ea799d184c9113cf67042aaae7c311add66325fb4d93: Status 404 returned error can't find the container with id 8ffc0adaf0b50b1155b9ea799d184c9113cf67042aaae7c311add66325fb4d93
Apr 23 13:34:43.060405 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:43.060322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"}
Apr 23 13:34:43.060405 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:43.060381 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"8ffc0adaf0b50b1155b9ea799d184c9113cf67042aaae7c311add66325fb4d93"}
Apr 23 13:34:43.063828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:43.063793 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerStarted","Data":"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"}
Apr 23 13:34:43.109943 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:43.109891 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.838678566 podStartE2EDuration="6.109872849s" podCreationTimestamp="2026-04-23 13:34:37 +0000 UTC" firstStartedPulling="2026-04-23 13:34:38.261619368 +0000 UTC m=+150.201793804" lastFinishedPulling="2026-04-23 13:34:42.532813647 +0000 UTC m=+154.472988087" observedRunningTime="2026-04-23 13:34:43.108221886 +0000 UTC m=+155.048396380" watchObservedRunningTime="2026-04-23 13:34:43.109872849 +0000 UTC m=+155.050047312"
Apr 23 13:34:44.070801 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.070763 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" event={"ID":"a151d0c0-6a27-40f5-b47a-6aa6b008093f","Type":"ContainerStarted","Data":"2e0e36cfcb71759844d618bc7e6b0deaa39783a9e73ac4c7336b06e37f62a4d3"}
Apr 23 13:34:44.070801 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.070802 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" event={"ID":"a151d0c0-6a27-40f5-b47a-6aa6b008093f","Type":"ContainerStarted","Data":"5ce12303f204c7e0ca060ac0f6d41526158f473aff73f8dfb685b86d668ceda4"}
Apr 23 13:34:44.071329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.070819 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" event={"ID":"a151d0c0-6a27-40f5-b47a-6aa6b008093f","Type":"ContainerStarted","Data":"0d5055cb0746b35f275ccd4b0371c1afdbe8b72ad65ee89188935ff1885d4b3f"}
Apr 23 13:34:44.072170 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.072139 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" exitCode=0
Apr 23 13:34:44.072298 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.072220 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"}
Apr 23 13:34:44.099168 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:44.099116 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-586489cc6d-wv59j" podStartSLOduration=1.5443118409999999 podStartE2EDuration="3.099099416s" podCreationTimestamp="2026-04-23 13:34:41 +0000 UTC" firstStartedPulling="2026-04-23 13:34:41.785107341 +0000 UTC m=+153.725281778" lastFinishedPulling="2026-04-23 13:34:43.3398949 +0000 UTC m=+155.280069353" observedRunningTime="2026-04-23 13:34:44.097465207 +0000 UTC m=+156.037639667" watchObservedRunningTime="2026-04-23 13:34:44.099099416 +0000 UTC m=+156.039273892"
Apr 23 13:34:44.431378 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:34:44.431349 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lk6nc" podUID="58b54dde-cfd8-43cb-8a0f-80463679527c"
Apr 23 13:34:44.437485 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:34:44.437460 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-slr9t" podUID="9d0ae972-b2b8-41fe-a688-e9f33be2d8f1"
Apr 23 13:34:45.075174 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:45.075143 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:34:45.075617 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:45.075213 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:34:46.908696 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:46.908635 2581 patch_prober.go:28] interesting pod/image-registry-59f44d7dcb-lmrp5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:46.909073 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:46.908726 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" podUID="1f89c38d-dd50-4f03-a48f-87d000b0dd2b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:48.085814 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:48.085780 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"}
Apr 23 13:34:48.085814 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:48.085814 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"}
Apr 23 13:34:48.704723 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:48.704665 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" podUID="d4e483b9-2cd1-4e40-97d6-718ee57eb42b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 13:34:49.005842 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.005818 2581 patch_prober.go:28] interesting pod/image-registry-59f44d7dcb-lmrp5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:49.005931 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.005863 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" podUID="1f89c38d-dd50-4f03-a48f-87d000b0dd2b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:49.091881 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.091854 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"}
Apr 23 13:34:49.092177 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.091889 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"}
Apr 23 13:34:49.092177 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.091899 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"}
Apr 23 13:34:49.092177 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.091907 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerStarted","Data":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"}
Apr 23 13:34:49.119752 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.119687 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.3740053740000002 podStartE2EDuration="7.11967439s" podCreationTimestamp="2026-04-23 13:34:42 +0000 UTC" firstStartedPulling="2026-04-23 13:34:44.073352988 +0000 UTC m=+156.013527428" lastFinishedPulling="2026-04-23 13:34:48.819021992 +0000 UTC m=+160.759196444" observedRunningTime="2026-04-23 13:34:49.117083674 +0000 UTC m=+161.057258133" watchObservedRunningTime="2026-04-23 13:34:49.11967439 +0000 UTC m=+161.059848848"
Apr 23 13:34:49.316011 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.315935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:34:49.316011 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.315966 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:34:49.318503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.318467 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58b54dde-cfd8-43cb-8a0f-80463679527c-metrics-tls\") pod \"dns-default-lk6nc\" (UID: \"58b54dde-cfd8-43cb-8a0f-80463679527c\") " pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:34:49.318503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.318484 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d0ae972-b2b8-41fe-a688-e9f33be2d8f1-cert\") pod \"ingress-canary-slr9t\" (UID: \"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1\") " pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:34:49.578195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.578106 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\""
Apr 23 13:34:49.578329 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.578238 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\""
Apr 23 13:34:49.586130 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.586107 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:34:49.586221 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.586108 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-slr9t"
Apr 23 13:34:49.714566 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.714518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lk6nc"]
Apr 23 13:34:49.717048 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:49.717021 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b54dde_cfd8_43cb_8a0f_80463679527c.slice/crio-97b4b24b8c60a88ef112949d963034f3e285cf9ab4295d72b9a4de3251ac62df WatchSource:0}: Error finding container 97b4b24b8c60a88ef112949d963034f3e285cf9ab4295d72b9a4de3251ac62df: Status 404 returned error can't find the container with id 97b4b24b8c60a88ef112949d963034f3e285cf9ab4295d72b9a4de3251ac62df
Apr 23 13:34:49.729292 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:49.729268 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-slr9t"]
Apr 23 13:34:49.732340 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:34:49.732315 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0ae972_b2b8_41fe_a688_e9f33be2d8f1.slice/crio-5934a2efd2c09f04a1c91db223c244e0333bb6f6eb15a25f751e7c590eab0e5a WatchSource:0}: Error finding container 5934a2efd2c09f04a1c91db223c244e0333bb6f6eb15a25f751e7c590eab0e5a: Status 404 returned error can't find the container with id 5934a2efd2c09f04a1c91db223c244e0333bb6f6eb15a25f751e7c590eab0e5a
Apr 23 13:34:50.095937 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:50.095902 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-slr9t" event={"ID":"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1","Type":"ContainerStarted","Data":"5934a2efd2c09f04a1c91db223c244e0333bb6f6eb15a25f751e7c590eab0e5a"}
Apr 23 13:34:50.096896 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:50.096871 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk6nc" event={"ID":"58b54dde-cfd8-43cb-8a0f-80463679527c","Type":"ContainerStarted","Data":"97b4b24b8c60a88ef112949d963034f3e285cf9ab4295d72b9a4de3251ac62df"}
Apr 23 13:34:52.109467 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.109434 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-slr9t" event={"ID":"9d0ae972-b2b8-41fe-a688-e9f33be2d8f1","Type":"ContainerStarted","Data":"04b2071179416ba7a35fbe11ccea14ff6e65999806fa0b5af57e6bf465b33401"}
Apr 23 13:34:52.111001 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.110974 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk6nc" event={"ID":"58b54dde-cfd8-43cb-8a0f-80463679527c","Type":"ContainerStarted","Data":"19925aff46f91a3ce72534ab2d3f2513f2011e65bd2a9625d7358bad366f613f"}
Apr 23 13:34:52.111001 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.111000 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk6nc" event={"ID":"58b54dde-cfd8-43cb-8a0f-80463679527c","Type":"ContainerStarted","Data":"8213fcd0e482e1213276525a556fda25bc8b4bd6ba4917ae522a7c8e03a28ef7"}
Apr 23 13:34:52.111156 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.111114 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:34:52.126312 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.126269 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-slr9t" podStartSLOduration=129.34108397 podStartE2EDuration="2m11.126258158s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:34:49.734233739 +0000 UTC m=+161.674408176" lastFinishedPulling="2026-04-23 13:34:51.519407919 +0000 UTC m=+163.459582364" observedRunningTime="2026-04-23 13:34:52.124616201 +0000 UTC m=+164.064790660" watchObservedRunningTime="2026-04-23 13:34:52.126258158 +0000 UTC m=+164.066432618"
Apr 23 13:34:52.142220 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.142174 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lk6nc" podStartSLOduration=129.34530256 podStartE2EDuration="2m11.142163342s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:34:49.719139573 +0000 UTC m=+161.659314010" lastFinishedPulling="2026-04-23 13:34:51.516000355 +0000 UTC m=+163.456174792" observedRunningTime="2026-04-23 13:34:52.140236935 +0000 UTC m=+164.080411427" watchObservedRunningTime="2026-04-23 13:34:52.142163342 +0000 UTC m=+164.082337826"
Apr 23 13:34:52.669099 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:52.669022 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:56.907429 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:56.907394 2581 patch_prober.go:28] interesting pod/image-registry-59f44d7dcb-lmrp5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:56.907830 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:56.907453 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5" podUID="1f89c38d-dd50-4f03-a48f-87d000b0dd2b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:58.704503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:58.704465 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" podUID="d4e483b9-2cd1-4e40-97d6-718ee57eb42b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 13:34:59.006105 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:34:59.006032 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-59f44d7dcb-lmrp5"
Apr 23 13:35:02.116856 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:02.116806 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lk6nc"
Apr 23 13:35:08.704432 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:08.704395 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" podUID="d4e483b9-2cd1-4e40-97d6-718ee57eb42b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 13:35:08.704804 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:08.704459 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694"
Apr 23 13:35:08.704941 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:08.704912 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"6881755f14f72a45aa090fa905714abd9870bd1760e5109f7325a955e96cea53"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 23 13:35:08.704978 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:08.704961 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" podUID="d4e483b9-2cd1-4e40-97d6-718ee57eb42b" containerName="service-proxy" containerID="cri-o://6881755f14f72a45aa090fa905714abd9870bd1760e5109f7325a955e96cea53" gracePeriod=30
Apr 23 13:35:09.160291 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:09.160203 2581 generic.go:358] "Generic (PLEG): container finished" podID="d4e483b9-2cd1-4e40-97d6-718ee57eb42b" containerID="6881755f14f72a45aa090fa905714abd9870bd1760e5109f7325a955e96cea53" exitCode=2
Apr 23 13:35:09.160291 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:09.160272 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerDied","Data":"6881755f14f72a45aa090fa905714abd9870bd1760e5109f7325a955e96cea53"}
Apr 23 13:35:09.160464 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:09.160309 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b996fc64-lj694" event={"ID":"d4e483b9-2cd1-4e40-97d6-718ee57eb42b","Type":"ContainerStarted","Data":"475473837e2cd20a0619449447e2c2f98eda2a564fd107879a048e0c247d9e50"}
Apr 23 13:35:42.669519 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:42.669485 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:42.688298 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:42.688271 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:43.276696 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:43.276671 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:56.469142 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469109 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:35:56.469574 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469519 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="alertmanager" containerID="cri-o://e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b" gracePeriod=120
Apr 23 13:35:56.469666 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469588 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-metric" containerID="cri-o://6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582" gracePeriod=120
Apr 23 13:35:56.469666 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469648 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="prom-label-proxy" containerID="cri-o://0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b" gracePeriod=120
Apr 23 13:35:56.469776 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469606 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-web" containerID="cri-o://4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3" gracePeriod=120
Apr 23 13:35:56.469776 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469652 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="config-reloader" containerID="cri-o://39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d" gracePeriod=120
Apr 23 13:35:56.469776 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:56.469618 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy" containerID="cri-o://944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289" gracePeriod=120
Apr 23 13:35:57.301950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.301914 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b" exitCode=0
Apr 23 13:35:57.301950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.301941 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289" exitCode=0
Apr 23 13:35:57.301950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.301948 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d" exitCode=0
Apr 23 13:35:57.301950 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.301953 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b" exitCode=0
Apr 23 13:35:57.302189 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.301987 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"}
Apr 23 13:35:57.302189 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.302021 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"}
Apr 23 13:35:57.302189 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.302032 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"}
Apr 23 13:35:57.302189 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.302041 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"}
Apr 23 13:35:57.709436 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.709411 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:57.725837 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725809 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.725963 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725862 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.725963 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725899 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.725963 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725924 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.725963 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725957 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.725988 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h77kc\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726032 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726058 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726087 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726113 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726112 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:35:57.726491 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726145 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726491 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726235 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.726491 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726296 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:57.726678 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726495 2581 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-main-db\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.726678 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726517 2581 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-metrics-client-ca\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.727242 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.726969 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:57.728853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.728822 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.729176 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.729148 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.729588 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.729562 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc" (OuterVolumeSpecName: "kube-api-access-h77kc") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "kube-api-access-h77kc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:57.729660 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.729580 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.729660 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.729595 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume" (OuterVolumeSpecName: "config-volume") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.731124 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.731093 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:57.731124 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.731106 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.733730 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.733627 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.741016 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.740993 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config" (OuterVolumeSpecName: "web-config") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:57.826828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826805 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out\") pod \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\" (UID: \"92f78839-a8c4-4f1c-9e12-af32079bdeaf\") "
Apr 23 13:35:57.826945 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826933 2581 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-cluster-tls-config\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.826986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826949 2581 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.826986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826960 2581 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-main-tls\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.826986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826970 2581 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.826986 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826980 2581 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-tls-assets\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.827107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826989 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h77kc\" (UniqueName: \"kubernetes.io/projected/92f78839-a8c4-4f1c-9e12-af32079bdeaf-kube-api-access-h77kc\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.827107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.826998 2581 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-web-config\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.827107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.827006 2581 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-volume\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.827107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.827014 2581 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92f78839-a8c4-4f1c-9e12-af32079bdeaf-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.827107 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.827022 2581 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92f78839-a8c4-4f1c-9e12-af32079bdeaf-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:57.828586 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.828564 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out" (OuterVolumeSpecName: "config-out") pod "92f78839-a8c4-4f1c-9e12-af32079bdeaf" (UID: "92f78839-a8c4-4f1c-9e12-af32079bdeaf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:35:57.927970 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:57.927950 2581 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92f78839-a8c4-4f1c-9e12-af32079bdeaf-config-out\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:35:58.307343 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307269 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582" exitCode=0
Apr 23 13:35:58.307343 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307294 2581 generic.go:358] "Generic (PLEG): container finished" podID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerID="4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3" exitCode=0
Apr 23 13:35:58.307343 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307327 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"}
Apr 23 13:35:58.307511 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307355 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"}
Apr 23 13:35:58.307511 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307366 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"92f78839-a8c4-4f1c-9e12-af32079bdeaf","Type":"ContainerDied","Data":"7d8139ea93833740f0277f20fd3173a973039bd9ef6a941ab9ea667f3ce38c60"}
Apr 23 13:35:58.307511 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307386 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.307511 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.307381 2581 scope.go:117] "RemoveContainer" containerID="0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"
Apr 23 13:35:58.315074 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.315054 2581 scope.go:117] "RemoveContainer" containerID="6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"
Apr 23 13:35:58.322956 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.322938 2581 scope.go:117] "RemoveContainer" containerID="944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"
Apr 23 13:35:58.329103 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.329087 2581 scope.go:117] "RemoveContainer" containerID="4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"
Apr 23 13:35:58.331867 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.331842 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:35:58.335831 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.335812 2581 scope.go:117] "RemoveContainer" containerID="39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"
Apr 23 13:35:58.340078 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.340055 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:35:58.343596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.343581 2581 scope.go:117] "RemoveContainer" containerID="e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"
Apr 23 13:35:58.349735 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.349718 2581 scope.go:117] "RemoveContainer" containerID="b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7"
Apr 23 13:35:58.355868 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.355853 2581 scope.go:117] "RemoveContainer" containerID="0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"
Apr 23 13:35:58.356105 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.356086 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b\": container with ID starting with 0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b not found: ID does not exist" containerID="0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"
Apr 23 13:35:58.356147 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356114 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"} err="failed to get container status \"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b\": rpc error: code = NotFound desc = could not find container \"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b\": container with ID starting with 0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b not found: ID does not exist"
Apr 23 13:35:58.356147 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356132 2581 scope.go:117] "RemoveContainer" containerID="6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"
Apr 23 13:35:58.356348 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.356332 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582\": container with ID starting with 6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582 not found: ID does not exist" containerID="6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"
Apr 23 13:35:58.356387 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356353 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"} err="failed to get container status \"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582\": rpc error: code = NotFound desc = could not find container \"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582\": container with ID starting with 6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582 not found: ID does not exist"
Apr 23 13:35:58.356387 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356368 2581 scope.go:117] "RemoveContainer" containerID="944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"
Apr 23 13:35:58.356645 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.356628 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289\": container with ID starting with 944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289 not found: ID does not exist" containerID="944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"
Apr 23 13:35:58.356682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356649 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"} err="failed to get container status \"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289\": rpc error: code = NotFound desc = could not find container \"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289\": container with ID starting with 944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289 not found: ID does not exist"
Apr 23 13:35:58.356682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356664 2581 scope.go:117] "RemoveContainer" containerID="4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"
Apr 23 13:35:58.356888 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.356872 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3\": container with ID starting with 4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3 not found: ID does not exist" containerID="4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"
Apr 23 13:35:58.356936 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356894 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"} err="failed to get container status \"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3\": rpc error: code = NotFound desc = could not find container \"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3\": container with ID starting with 4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3 not found: ID does not exist"
Apr 23 13:35:58.356936 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.356908 2581 scope.go:117] "RemoveContainer" containerID="39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"
Apr 23 13:35:58.357133 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.357116 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d\": container with ID starting with 39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d not found: ID does not exist" containerID="39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"
Apr 23 13:35:58.357175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357138 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"} err="failed to get container status \"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d\": rpc error: code = NotFound desc = could not find container \"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d\": container with ID starting with 39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d not found: ID does not exist"
Apr 23 13:35:58.357175 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357153 2581 scope.go:117] "RemoveContainer" containerID="e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"
Apr 23 13:35:58.357354 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.357337 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b\": container with ID starting with e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b not found: ID does not exist" containerID="e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"
Apr 23 13:35:58.357406 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357358 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"} err="failed to get container status \"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b\": rpc error: code = NotFound desc = could not find container \"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b\": container with ID starting with e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b not found: ID does not exist"
Apr 23 13:35:58.357406 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357371 2581 scope.go:117]
"RemoveContainer" containerID="b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7" Apr 23 13:35:58.357611 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:35:58.357595 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7\": container with ID starting with b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7 not found: ID does not exist" containerID="b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7" Apr 23 13:35:58.357672 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357613 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7"} err="failed to get container status \"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7\": rpc error: code = NotFound desc = could not find container \"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7\": container with ID starting with b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7 not found: ID does not exist" Apr 23 13:35:58.357672 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357624 2581 scope.go:117] "RemoveContainer" containerID="0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b" Apr 23 13:35:58.357882 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357860 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b"} err="failed to get container status \"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b\": rpc error: code = NotFound desc = could not find container \"0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b\": container with ID starting with 0bc898248e471c234331eb5da0ac92868801ed90bf6478b64335f2480eeaa84b not found: ID does not 
exist" Apr 23 13:35:58.357926 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.357888 2581 scope.go:117] "RemoveContainer" containerID="6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582" Apr 23 13:35:58.358085 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358070 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582"} err="failed to get container status \"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582\": rpc error: code = NotFound desc = could not find container \"6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582\": container with ID starting with 6ab00d226a881f92dfab12f1da2269323c04c64157b1a52353ca8c5478a2b582 not found: ID does not exist" Apr 23 13:35:58.358142 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358085 2581 scope.go:117] "RemoveContainer" containerID="944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289" Apr 23 13:35:58.358294 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358275 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289"} err="failed to get container status \"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289\": rpc error: code = NotFound desc = could not find container \"944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289\": container with ID starting with 944aa622d70da55341c706bdf2bb16f46a3bcaa8517f6701dc394740054f8289 not found: ID does not exist" Apr 23 13:35:58.358337 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358293 2581 scope.go:117] "RemoveContainer" containerID="4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3" Apr 23 13:35:58.358466 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358449 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3"} err="failed to get container status \"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3\": rpc error: code = NotFound desc = could not find container \"4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3\": container with ID starting with 4a636301a891a7f4b52bc276f2b9a4f9bb7cc75c848d52fa27238aff15a67ed3 not found: ID does not exist" Apr 23 13:35:58.358515 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358468 2581 scope.go:117] "RemoveContainer" containerID="39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d" Apr 23 13:35:58.358730 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358709 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d"} err="failed to get container status \"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d\": rpc error: code = NotFound desc = could not find container \"39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d\": container with ID starting with 39fe5c8a952ed04b7027f017753f5093663e8a45d1f2abfdae8ece4fcdd79d1d not found: ID does not exist" Apr 23 13:35:58.358730 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358730 2581 scope.go:117] "RemoveContainer" containerID="e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b" Apr 23 13:35:58.358963 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358946 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b"} err="failed to get container status \"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b\": rpc error: code = NotFound desc = could not find container \"e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b\": container with ID starting with 
e66224458d426a8b273eada5895e6c3bf2cb13d244aca9ddaeacddc737f03c3b not found: ID does not exist" Apr 23 13:35:58.359006 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.358964 2581 scope.go:117] "RemoveContainer" containerID="b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7" Apr 23 13:35:58.359157 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.359143 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7"} err="failed to get container status \"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7\": rpc error: code = NotFound desc = could not find container \"b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7\": container with ID starting with b34451eb2121fdc819c55c7d4b4f07d85321e4aa3c267fd993d9c6f4870e0de7 not found: ID does not exist" Apr 23 13:35:58.372221 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372198 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:58.372502 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372487 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="prom-label-proxy" Apr 23 13:35:58.372502 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372502 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="prom-label-proxy" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372514 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-web" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372520 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-web" Apr 
23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372546 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="config-reloader" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372555 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="config-reloader" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372570 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="init-config-reloader" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372578 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="init-config-reloader" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372586 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-metric" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372591 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-metric" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372604 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372610 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372617 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="alertmanager" Apr 23 13:35:58.372648 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372622 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="alertmanager" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372673 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="prom-label-proxy" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372682 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="alertmanager" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372690 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-web" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372696 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="config-reloader" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372702 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy-metric" Apr 23 13:35:58.372985 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.372709 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" containerName="kube-rbac-proxy" Apr 23 13:35:58.377566 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.377518 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.380031 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380009 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 13:35:58.380125 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380033 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 13:35:58.380339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380267 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 13:35:58.380339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380285 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 13:35:58.380339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380309 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 13:35:58.380339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380315 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5tgj2\"" Apr 23 13:35:58.380594 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380315 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 13:35:58.380594 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380288 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 13:35:58.380594 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.380267 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 13:35:58.386390 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.386372 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 13:35:58.389339 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.389322 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:58.431050 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431018 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431137 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431061 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431137 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsfh4\" (UniqueName: \"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-kube-api-access-wsfh4\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431137 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431129 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431264 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431187 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431264 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-web-config\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431324 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431297 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431356 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431337 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-config-out\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 
13:35:58.431391 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431357 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431423 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431389 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431455 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431433 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431488 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431471 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-config-volume\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.431520 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.431488 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532489 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532468 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532588 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532499 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532588 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532516 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsfh4\" (UniqueName: \"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-kube-api-access-wsfh4\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532694 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532694 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:35:58.532665 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532783 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532694 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-web-config\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532783 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532711 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532783 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532747 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-config-out\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532783 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532772 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
23 13:35:58.532972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532811 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532847 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532875 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-config-volume\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.532972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.532899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.533230 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.533206 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.533758 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.533703 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.533856 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.533801 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43e47fd6-0b15-47e7-a358-859466a20230-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.535971 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.535860 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.535971 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.535911 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:58.535971 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.535940 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-web-config\") pod \"alertmanager-main-0\" (UID: 
\"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.535971 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.535966 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.536204 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.536088 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.536204 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.536188 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.536312 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.536277 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43e47fd6-0b15-47e7-a358-859466a20230-config-out\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.536368 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.536353 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.538011 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.537990 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43e47fd6-0b15-47e7-a358-859466a20230-config-volume\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.540426 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.540410 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsfh4\" (UniqueName: \"kubernetes.io/projected/43e47fd6-0b15-47e7-a358-859466a20230-kube-api-access-wsfh4\") pod \"alertmanager-main-0\" (UID: \"43e47fd6-0b15-47e7-a358-859466a20230\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.576467 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.576414 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f78839-a8c4-4f1c-9e12-af32079bdeaf" path="/var/lib/kubelet/pods/92f78839-a8c4-4f1c-9e12-af32079bdeaf/volumes"
Apr 23 13:35:58.686431 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.686398 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:35:58.810611 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:58.810589 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:35:58.813060 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:35:58.813029 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e47fd6_0b15_47e7_a358_859466a20230.slice/crio-70f753d4d4cd9ee40f12acb8f42960154c3aea35e7a242f929a7ec7ea4c76548 WatchSource:0}: Error finding container 70f753d4d4cd9ee40f12acb8f42960154c3aea35e7a242f929a7ec7ea4c76548: Status 404 returned error can't find the container with id 70f753d4d4cd9ee40f12acb8f42960154c3aea35e7a242f929a7ec7ea4c76548
Apr 23 13:35:59.311847 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:59.311811 2581 generic.go:358] "Generic (PLEG): container finished" podID="43e47fd6-0b15-47e7-a358-859466a20230" containerID="4761ebdae90d28855dcdf455fe849fdbabc2636efc9d15792002a130525b7f2a" exitCode=0
Apr 23 13:35:59.312001 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:59.311858 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerDied","Data":"4761ebdae90d28855dcdf455fe849fdbabc2636efc9d15792002a130525b7f2a"}
Apr 23 13:35:59.312001 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:35:59.311903 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"70f753d4d4cd9ee40f12acb8f42960154c3aea35e7a242f929a7ec7ea4c76548"}
Apr 23 13:36:00.319388 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319353 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"8e06b4b3614d24824cd21d14ee244df9214b2ac9bce52a16388a6d5d7d7b9689"}
Apr 23 13:36:00.319794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319394 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"6ff88314d1a38a9cbc7283e43144d6b22bf8034d631611a3c8bd6e58ff367e0a"}
Apr 23 13:36:00.319794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319410 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"aec4f2a4636e2470d29047d12afcf505774fadb4c1a52848acf142b592b659dd"}
Apr 23 13:36:00.319794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319422 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"fd6d622be5cbb649b04a7a788c2609396cb43cee8d74ae1cf06fe8a94f3b5153"}
Apr 23 13:36:00.319794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319438 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"1b4e56ef2d0b0d823e4ddaa828580161cc6395c77c4ffa7406893e36b6eaf6af"}
Apr 23 13:36:00.319794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.319450 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43e47fd6-0b15-47e7-a358-859466a20230","Type":"ContainerStarted","Data":"ed461851d0f7ba2351a4bc3364fa2bcc7b93a37c9611528ba2d1bfbbabb16606"}
Apr 23 13:36:00.345425 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.345370 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.345352017 podStartE2EDuration="2.345352017s" podCreationTimestamp="2026-04-23 13:35:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:00.343610254 +0000 UTC m=+232.283784712" watchObservedRunningTime="2026-04-23 13:36:00.345352017 +0000 UTC m=+232.285526478"
Apr 23 13:36:00.701154 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701121 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:00.701779 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701731 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy" containerID="cri-o://951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" gracePeriod=600
Apr 23 13:36:00.701953 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701730 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="prometheus" containerID="cri-o://c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" gracePeriod=600
Apr 23 13:36:00.702006 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701756 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="thanos-sidecar" containerID="cri-o://2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" gracePeriod=600
Apr 23 13:36:00.702245 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701761 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-web" containerID="cri-o://5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" gracePeriod=600
Apr 23 13:36:00.702245 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701795 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" gracePeriod=600
Apr 23 13:36:00.702245 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.701796 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="config-reloader" containerID="cri-o://d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" gracePeriod=600
Apr 23 13:36:00.934507 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.934485 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:00.950891 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.950871 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951014 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.950898 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951014 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.950921 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951014 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.950944 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951014 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.950971 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951014 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951048 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951081 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951107 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951143 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951171 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951200 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951239 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951239 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951263 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951303 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951331 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951329 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951362 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkgsb\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951391 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls\") pod \"09b99f39-20d2-44f1-92ac-26437b23a8f2\" (UID: \"09b99f39-20d2-44f1-92ac-26437b23a8f2\") "
Apr 23 13:36:00.951649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951386 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:00.952164 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951806 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:00.952164 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951826 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:00.952164 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.951832 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:00.953611 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.952662 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:00.953611 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.952843 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:00.953871 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.953830 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.953996 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.953947 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:00.954219 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.954177 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.954544 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.954486 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.954645 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.954554 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:00.954836 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.954794 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.955564 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.955502 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb" (OuterVolumeSpecName: "kube-api-access-hkgsb") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "kube-api-access-hkgsb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:00.955949 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.955928 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.956165 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.956145 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out" (OuterVolumeSpecName: "config-out") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:00.956215 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.956177 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.956899 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.956878 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.956952 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.956905 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config" (OuterVolumeSpecName: "config") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:00.970199 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:00.970175 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config" (OuterVolumeSpecName: "web-config") pod "09b99f39-20d2-44f1-92ac-26437b23a8f2" (UID: "09b99f39-20d2-44f1-92ac-26437b23a8f2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:01.052473 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052449 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052473 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052472 2581 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-metrics-client-certs\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052482 2581 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-config\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052492 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052501 2581 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-config-out\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052509 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052518 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkgsb\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-kube-api-access-hkgsb\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052544 2581 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-grpc-tls\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052553 2581 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-kube-rbac-proxy\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052561 2581 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-web-config\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052570 2581 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052578 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052587 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09b99f39-20d2-44f1-92ac-26437b23a8f2-prometheus-k8s-db\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052595 2581 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09b99f39-20d2-44f1-92ac-26437b23a8f2-tls-assets\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052603 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b99f39-20d2-44f1-92ac-26437b23a8f2-configmap-metrics-client-ca\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.052626 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.052612 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09b99f39-20d2-44f1-92ac-26437b23a8f2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-103.ec2.internal\" DevicePath \"\""
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325140 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325165 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325172 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325180 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325184 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325191 2581 generic.go:358] "Generic (PLEG): container finished" podID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" exitCode=0
Apr 23 13:36:01.325227 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325214 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325262 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325278 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325285 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325291 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325305 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325318 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325331 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09b99f39-20d2-44f1-92ac-26437b23a8f2","Type":"ContainerDied","Data":"8ffc0adaf0b50b1155b9ea799d184c9113cf67042aaae7c311add66325fb4d93"}
Apr 23 13:36:01.325853 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.325349 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"
Apr 23 13:36:01.333015 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.332973 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"
Apr 23 13:36:01.340856 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.340839 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"
Apr 23 13:36:01.347774 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.347750 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"
Apr 23 13:36:01.349407 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.349384 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:01.353375 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.353358 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:01.356451 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.356433 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"
Apr 23 13:36:01.362754 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.362740 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"
Apr 23 13:36:01.369276 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.369260 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"
Apr 23 13:36:01.375515 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.375483 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"
Apr 23 13:36:01.375814 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.375789 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"
Apr 23 13:36:01.375880 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.375825 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist"
Apr 23 13:36:01.375880 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.375851 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"
Apr 23 13:36:01.376016 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.375998 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:01.376091 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.376072 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"
Apr 23 13:36:01.376133 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376108 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist"
Apr 23 13:36:01.376133 ip-10-0-129-103 kubenswrapper[2581]:
I0423 13:36:01.376127 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.376292 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376277 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-thanos" Apr 23 13:36:01.376346 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376297 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-thanos" Apr 23 13:36:01.376346 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376312 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-web" Apr 23 13:36:01.376346 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376320 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-web" Apr 23 13:36:01.376346 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.376315 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.376346 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376340 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="thanos-sidecar" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376349 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="thanos-sidecar" Apr 23 
13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376341 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376363 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376363 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="init-config-reloader" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376408 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="init-config-reloader" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376421 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="config-reloader" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376429 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="config-reloader" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376442 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy" Apr 23 13:36:01.376517 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:36:01.376451 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376460 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="prometheus" Apr 23 13:36:01.376517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376465 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="prometheus" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376577 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-thanos" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376593 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="config-reloader" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376606 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="prometheus" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376615 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="thanos-sidecar" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.376613 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 
13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376639 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376658 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376625 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy-web" Apr 23 13:36:01.376920 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376708 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" containerName="kube-rbac-proxy" Apr 23 13:36:01.377209 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.376925 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.377209 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376950 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to 
get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.377209 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.376969 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.377209 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.377170 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.377209 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377191 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.377390 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377209 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.377440 ip-10-0-129-103 kubenswrapper[2581]: E0423 13:36:01.377424 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.377482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377446 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.377482 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377466 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" Apr 23 13:36:01.377804 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377739 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" Apr 23 13:36:01.377804 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.377762 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" Apr 23 13:36:01.378019 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:36:01.377999 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist" Apr 23 13:36:01.378061 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378020 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.378247 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378227 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.378335 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378250 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.378475 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378459 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container 
\"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.378582 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378477 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.378741 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378723 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.378791 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378742 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.378933 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378914 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.379030 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.378934 2581 scope.go:117] "RemoveContainer" 
containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.379140 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379120 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.379182 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379140 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" Apr 23 13:36:01.379338 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379323 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" Apr 23 13:36:01.379383 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379338 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" Apr 23 13:36:01.379517 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379501 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status 
\"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist" Apr 23 13:36:01.379606 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379518 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.379737 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379721 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.379779 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379737 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.379895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.379880 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.379939 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:36:01.379895 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.380072 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380054 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.380130 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380073 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.380229 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380213 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.380287 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380230 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.380417 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380399 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.380465 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380419 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" Apr 23 13:36:01.380627 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380606 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" Apr 23 13:36:01.380680 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380629 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" Apr 23 13:36:01.380821 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380804 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 
951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist" Apr 23 13:36:01.380885 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380822 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.381005 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.380990 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.381048 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381005 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.381174 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381160 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.381222 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381174 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.381354 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381336 2581 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.381354 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381352 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.381558 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381537 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.381558 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381557 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.381757 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381737 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container 
\"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.381807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381759 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" Apr 23 13:36:01.381932 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381918 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.382038 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.381917 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" Apr 23 13:36:01.382096 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382046 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" Apr 23 13:36:01.382503 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382462 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: 
ID does not exist" Apr 23 13:36:01.382594 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382505 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.382759 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382744 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.382805 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382760 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.382976 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382959 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.383115 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.382978 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.383205 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383189 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.383243 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383206 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.383422 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383405 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.383465 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383422 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.383677 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383658 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 
5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.383677 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383678 2581 scope.go:117] "RemoveContainer" containerID="ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12" Apr 23 13:36:01.383895 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383879 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12"} err="failed to get container status \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": rpc error: code = NotFound desc = could not find container \"ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12\": container with ID starting with ba5281e24f68cf701142735159f580663befcce2ccca0c9823f86587fa7f8f12 not found: ID does not exist" Apr 23 13:36:01.383943 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.383896 2581 scope.go:117] "RemoveContainer" containerID="951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444" Apr 23 13:36:01.384140 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384118 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444"} err="failed to get container status \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": rpc error: code = NotFound desc = could not find container \"951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444\": container with ID starting with 951c67b774cf025939d541c7f922e8ee048333202b205f77e3309c7af9822444 not found: ID does not exist" Apr 23 13:36:01.384207 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384142 2581 scope.go:117] "RemoveContainer" containerID="5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720" Apr 23 13:36:01.384394 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384375 2581 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 13:36:01.384469 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384441 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 13:36:01.384469 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384464 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 13:36:01.384661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384487 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5rw9w\"" Apr 23 13:36:01.384661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384494 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 13:36:01.384661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384627 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720"} err="failed to get container status \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": rpc error: code = NotFound desc = could not find container \"5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720\": container with ID starting with 5c9a1682b9cf186ccd265b52a3dabc7add8daa2b0793853fe411547f665bb720 not found: ID does not exist" Apr 23 13:36:01.384661 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384649 2581 scope.go:117] "RemoveContainer" containerID="2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9" Apr 23 13:36:01.384850 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384712 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 13:36:01.384925 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384900 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9"} err="failed to get container status \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": rpc error: code = NotFound desc = could not find container \"2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9\": container with ID starting with 2c7817d01f4d289b7567e7005f883085d7cb2b00732fa184cf35a27f38e597d9 not found: ID does not exist" Apr 23 13:36:01.384977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384929 2581 scope.go:117] "RemoveContainer" containerID="d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea" Apr 23 13:36:01.384977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384953 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 13:36:01.384977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.384960 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5162j7pc9hkj7\"" Apr 23 13:36:01.385200 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385177 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 13:36:01.385261 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385232 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 13:36:01.385261 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385253 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:36:01.385396 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385369 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea"} err="failed to get container status \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": rpc error: code = NotFound desc = could not find container \"d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea\": container with ID starting with d24906aa5c6d66c323ca4d1e18cd6b5b8425454cec8165a623ae5063aabf9eea not found: ID does not exist" Apr 23 13:36:01.385396 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385395 2581 scope.go:117] "RemoveContainer" containerID="c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b" Apr 23 13:36:01.385581 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385562 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 13:36:01.385673 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385659 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 13:36:01.385731 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.385686 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b"} err="failed to get container status \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": rpc error: code = NotFound desc = could not find container \"c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b\": container with ID starting with c4c8c505054716ee9eb0aacd764e706d58fb2d50f18be3ed58a543713aa8a04b not found: ID does not exist" Apr 23 13:36:01.385731 ip-10-0-129-103 kubenswrapper[2581]: I0423 
13:36:01.385722 2581 scope.go:117] "RemoveContainer" containerID="5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1" Apr 23 13:36:01.386028 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.386005 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1"} err="failed to get container status \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": rpc error: code = NotFound desc = could not find container \"5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1\": container with ID starting with 5bded86a62c129593e86c1ec75112573753cff95673bed1884fe307bb8cb9bc1 not found: ID does not exist" Apr 23 13:36:01.387489 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.387469 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 13:36:01.392013 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.391991 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 13:36:01.393487 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.393469 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:36:01.456184 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456159 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456265 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456187 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456265 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456206 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456265 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456232 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456364 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456364 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456317 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456364 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456336 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456456 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456401 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456456 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456423 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456456 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456442 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456561 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456461 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456561 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456477 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456561 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlxr\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-kube-api-access-wrlxr\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456547 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456585 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456649 ip-10-0-129-103 kubenswrapper[2581]: 
I0423 13:36:01.456603 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.456649 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.456640 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557469 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557450 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557572 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557477 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557572 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557493 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557572 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557508 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557702 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557674 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557751 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557711 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557751 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557744 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557847 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557774 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557847 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557798 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557847 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557826 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlxr\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-kube-api-access-wrlxr\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557894 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557924 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.557983 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557959 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.558173 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557988 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.558173 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.558013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.558173 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:36:01.558040 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.558173 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.558072 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.558377 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.558248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.559794 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.558937 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:01.560600 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.560574 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
13:36:01.560720 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.560697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.560933 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.560913 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.561077 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.561055 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.561155 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.557854 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.561155 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.561095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.561390 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.561346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.561652 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.561631 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.562029 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.562006 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.562073 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.562025 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.562749 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.562729 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26898d76-0d57-4716-8798-58827a3a91ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.563170 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.563151 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.563455 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.563437 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/26898d76-0d57-4716-8798-58827a3a91ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.563579 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.563564 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.563916 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.563900 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/26898d76-0d57-4716-8798-58827a3a91ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.566881 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.566858 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlxr\" (UniqueName: \"kubernetes.io/projected/26898d76-0d57-4716-8798-58827a3a91ad-kube-api-access-wrlxr\") pod \"prometheus-k8s-0\" (UID: \"26898d76-0d57-4716-8798-58827a3a91ad\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.693241 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.693210 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:01.819427 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:01.819400 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:01.821827 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:36:01.821798 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26898d76_0d57_4716_8798_58827a3a91ad.slice/crio-947ff39dfddefbbbcbe9a44308c7e07d4c07bd65666044946a35dcff7f3b948a WatchSource:0}: Error finding container 947ff39dfddefbbbcbe9a44308c7e07d4c07bd65666044946a35dcff7f3b948a: Status 404 returned error can't find the container with id 947ff39dfddefbbbcbe9a44308c7e07d4c07bd65666044946a35dcff7f3b948a
Apr 23 13:36:02.331230 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:02.331197 2581 generic.go:358] "Generic (PLEG): container finished" podID="26898d76-0d57-4716-8798-58827a3a91ad" containerID="dabd16467ae31d456af5df466f91d47302c74627c2cfde821570b2c2dc92c880" exitCode=0
Apr 23 13:36:02.331596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:02.331259 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerDied","Data":"dabd16467ae31d456af5df466f91d47302c74627c2cfde821570b2c2dc92c880"}
Apr 23 13:36:02.331596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:02.331280 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"947ff39dfddefbbbcbe9a44308c7e07d4c07bd65666044946a35dcff7f3b948a"}
Apr 23 13:36:02.579294 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:02.579259 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b99f39-20d2-44f1-92ac-26437b23a8f2" path="/var/lib/kubelet/pods/09b99f39-20d2-44f1-92ac-26437b23a8f2/volumes"
Apr 23 13:36:03.338468 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338434 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"042201c8d1bddd80118d91faec0f65c7aabb449bee2fdecc18e2e3c4055dc5ad"}
Apr 23 13:36:03.338807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338474 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"f3e8b0a3c1464859a848693bfc3ffd41c80bed14f5c48444ec190979f5e6a0bc"}
Apr 23 13:36:03.338807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338485 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"d86418bbdb45916078c80a405c19b885f0dcbafa37b320811a5d6036e98738bb"}
Apr 23 13:36:03.338807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338494 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"c8887701f50ebbc4c200cbc1d9a95d07fcc3a8212ea23b688d23262e0420b7e6"}
Apr 23 13:36:03.338807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338502 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"2e07cc14a26b591818e72ab14407c47d5f2a97ed6f49c14b82a9f3b0b28c950f"}
Apr 23 13:36:03.338807 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.338510 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26898d76-0d57-4716-8798-58827a3a91ad","Type":"ContainerStarted","Data":"654b66ef6875647250dd84364065f40129886397189cfe34df2fdcbbd04d9b90"}
Apr 23 13:36:03.366195 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:03.366144 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.36612717 podStartE2EDuration="2.36612717s" podCreationTimestamp="2026-04-23 13:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:03.364546793 +0000 UTC m=+235.304721243" watchObservedRunningTime="2026-04-23 13:36:03.36612717 +0000 UTC m=+235.306301630"
Apr 23 13:36:06.693828 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:36:06.693791 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:37:01.693358 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:01.693324 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:37:01.708561 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:01.708519 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:37:02.522132 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:02.522106 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:37:08.488485 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:08.488456 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:37:08.489013 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:08.488605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:37:08.495731 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:37:08.495713 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 13:42:08.508960 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:42:08.508927 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:42:08.510057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:42:08.510038 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:47:08.530816 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:47:08.530741 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:47:08.533878 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:47:08.533010 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:52:08.553871 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:52:08.553841 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:52:08.556549 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:52:08.556510 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log"
Apr 23 13:54:42.620460 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.620378 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sl86/must-gather-6rtgw"]
Apr 23 13:54:42.623660 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.623639 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.626348 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.626320 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sl86\"/\"default-dockercfg-dr2cm\""
Apr 23 13:54:42.626600 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.626582 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"openshift-service-ca.crt\""
Apr 23 13:54:42.627088 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.627069 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"kube-root-ca.crt\""
Apr 23 13:54:42.632187 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.632165 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/must-gather-6rtgw"]
Apr 23 13:54:42.652375 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.652352 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-must-gather-output\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.652454 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.652392 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxk7q\" (UniqueName: \"kubernetes.io/projected/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-kube-api-access-hxk7q\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.753702 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.753681 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxk7q\" (UniqueName: \"kubernetes.io/projected/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-kube-api-access-hxk7q\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.753782 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.753736 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-must-gather-output\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.753977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.753964 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-must-gather-output\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.761958 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.761935 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxk7q\" (UniqueName: \"kubernetes.io/projected/b0e9a55d-73db-4b7c-9e0c-661e88e56c9c-kube-api-access-hxk7q\") pod \"must-gather-6rtgw\" (UID: \"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c\") " pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:42.933160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:42.933142 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/must-gather-6rtgw"
Apr 23 13:54:43.043177 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:43.043147 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/must-gather-6rtgw"]
Apr 23 13:54:43.045878 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:54:43.045846 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0e9a55d_73db_4b7c_9e0c_661e88e56c9c.slice/crio-b57bbe19fa467016aec9512a4c52ccd08aff2f105612dc23a58fbcb552a7ea62 WatchSource:0}: Error finding container b57bbe19fa467016aec9512a4c52ccd08aff2f105612dc23a58fbcb552a7ea62: Status 404 returned error can't find the container with id b57bbe19fa467016aec9512a4c52ccd08aff2f105612dc23a58fbcb552a7ea62
Apr 23 13:54:43.047618 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:43.047601 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:54:43.219969 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:43.219909 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/must-gather-6rtgw" event={"ID":"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c","Type":"ContainerStarted","Data":"b57bbe19fa467016aec9512a4c52ccd08aff2f105612dc23a58fbcb552a7ea62"}
Apr 23 13:54:44.225382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:44.225337 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/must-gather-6rtgw" event={"ID":"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c","Type":"ContainerStarted","Data":"45e2d18eb9a1ae87c1d51f9a700e365559f538b5b9d3ec3b8c39d2118b6a5822"}
Apr 23 13:54:44.225382 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:44.225388 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/must-gather-6rtgw" event={"ID":"b0e9a55d-73db-4b7c-9e0c-661e88e56c9c","Type":"ContainerStarted","Data":"fe55cf90c1bf168cb64d3f9f701e907730e6cf70ffa9a5e3593420b5bfd316d2"}
Apr 23 13:54:44.244057 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:44.243991 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sl86/must-gather-6rtgw" podStartSLOduration=1.390488253 podStartE2EDuration="2.243972892s" podCreationTimestamp="2026-04-23 13:54:42 +0000 UTC" firstStartedPulling="2026-04-23 13:54:43.047777181 +0000 UTC m=+1354.987951631" lastFinishedPulling="2026-04-23 13:54:43.901261834 +0000 UTC m=+1355.841436270" observedRunningTime="2026-04-23 13:54:44.242692152 +0000 UTC m=+1356.182866612" watchObservedRunningTime="2026-04-23 13:54:44.243972892 +0000 UTC m=+1356.184147352"
Apr 23 13:54:45.287106 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:45.287077 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jwjhj_7060a6e1-461e-45bb-85bd-9609300f9b17/global-pull-secret-syncer/0.log"
Apr 23 13:54:45.471669 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:45.471637 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h6brm_267b4640-e7c2-4100-9c7d-2623b5ee12fd/konnectivity-agent/0.log"
Apr 23 13:54:45.529461 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:45.529433 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-103.ec2.internal_418931079fe802d41c31b61d5b4bcb82/haproxy/0.log"
Apr 23 13:54:48.602341 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.602281 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/alertmanager/0.log"
Apr 23 13:54:48.642181 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.642087 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/config-reloader/0.log"
Apr 23 13:54:48.677682 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.677661 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/kube-rbac-proxy-web/0.log"
Apr 23 13:54:48.714071 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.713993 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/kube-rbac-proxy/0.log"
Apr 23 13:54:48.740303 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.740277 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/kube-rbac-proxy-metric/0.log"
Apr 23 13:54:48.786623 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.786592 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/prom-label-proxy/0.log"
Apr 23 13:54:48.814133 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:48.814093 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43e47fd6-0b15-47e7-a358-859466a20230/init-config-reloader/0.log"
Apr 23 13:54:49.010775 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.010654 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qd4l_9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f/node-exporter/0.log"
Apr 23 13:54:49.034465 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.034434 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qd4l_9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f/kube-rbac-proxy/0.log"
Apr 23 13:54:49.062420 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.062397 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qd4l_9b0c9cfe-364c-4cc2-b3a7-574e52e2fe9f/init-textfile/0.log"
Apr 23 13:54:49.345286 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.345206 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/prometheus/0.log"
Apr 23 13:54:49.362953 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.362916 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/config-reloader/0.log"
Apr 23 13:54:49.386027 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.386000 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/thanos-sidecar/0.log"
Apr 23 13:54:49.407977 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.407936 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/kube-rbac-proxy-web/0.log"
Apr 23 13:54:49.429620 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.429582 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/kube-rbac-proxy/0.log"
Apr 23 13:54:49.455882 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.455851 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/kube-rbac-proxy-thanos/0.log"
Apr 23 13:54:49.482367 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.482321 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26898d76-0d57-4716-8798-58827a3a91ad/init-config-reloader/0.log"
Apr 23 13:54:49.589609 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.589580 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586489cc6d-wv59j_a151d0c0-6a27-40f5-b47a-6aa6b008093f/telemeter-client/0.log"
Apr 23 13:54:49.610566 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.610467 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586489cc6d-wv59j_a151d0c0-6a27-40f5-b47a-6aa6b008093f/reload/0.log"
Apr 23 13:54:49.630916 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:49.630887 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586489cc6d-wv59j_a151d0c0-6a27-40f5-b47a-6aa6b008093f/kube-rbac-proxy/0.log"
Apr 23 13:54:52.408142 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.408111 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"]
Apr 23 13:54:52.412397 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.412368 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.422306 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.422278 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"]
Apr 23 13:54:52.546436 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.546394 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-podres\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.546603 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.546441 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-lib-modules\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.546603 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.546510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-sys\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.546690 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.546640 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-proc\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.546690 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.546675 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf8cj\" (UniqueName: \"kubernetes.io/projected/c16fdb9d-17dc-4917-9967-143265f11734-kube-api-access-cf8cj\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.647555 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647506 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-proc\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.647709 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647565 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf8cj\" (UniqueName: \"kubernetes.io/projected/c16fdb9d-17dc-4917-9967-143265f11734-kube-api-access-cf8cj\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.647709 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647632 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-podres\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.647709 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647669 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-lib-modules\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.647709 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-sys\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.648038 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647796 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-sys\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.648038 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647886 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-proc\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.648038 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647910 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-lib-modules\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.648038 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.647913 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c16fdb9d-17dc-4917-9967-143265f11734-podres\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.656839 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.656820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf8cj\" (UniqueName: \"kubernetes.io/projected/c16fdb9d-17dc-4917-9967-143265f11734-kube-api-access-cf8cj\") pod \"perf-node-gather-daemonset-5v4xl\" (UID: \"c16fdb9d-17dc-4917-9967-143265f11734\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.725972 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.724906 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:52.859852 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.859812 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"]
Apr 23 13:54:52.862670 ip-10-0-129-103 kubenswrapper[2581]: W0423 13:54:52.862641 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc16fdb9d_17dc_4917_9967_143265f11734.slice/crio-c8c5b5baa1ea3726ce36c6078cc0927056300a8cf9fdcd523b831541f180df5f WatchSource:0}: Error finding container c8c5b5baa1ea3726ce36c6078cc0927056300a8cf9fdcd523b831541f180df5f: Status 404 returned error can't find the container with id c8c5b5baa1ea3726ce36c6078cc0927056300a8cf9fdcd523b831541f180df5f
Apr 23 13:54:52.905662 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.905643 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lk6nc_58b54dde-cfd8-43cb-8a0f-80463679527c/dns/0.log"
Apr 23 13:54:52.925876 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.925857 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lk6nc_58b54dde-cfd8-43cb-8a0f-80463679527c/kube-rbac-proxy/0.log"
Apr 23 13:54:52.992290 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:52.992244 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qgrhz_b35db408-8233-451b-984c-90d90df7a815/dns-node-resolver/0.log"
Apr 23 13:54:53.256624 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.256507 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl" event={"ID":"c16fdb9d-17dc-4917-9967-143265f11734","Type":"ContainerStarted","Data":"0161f30c2593ac681118532e649831586674e00e36fd0bcbf6c1e393a86d3b14"}
Apr 23 13:54:53.256624 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.256578 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl" event={"ID":"c16fdb9d-17dc-4917-9967-143265f11734","Type":"ContainerStarted","Data":"c8c5b5baa1ea3726ce36c6078cc0927056300a8cf9fdcd523b831541f180df5f"}
Apr 23 13:54:53.257310 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.257283 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:54:53.273714 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.273661 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl" podStartSLOduration=1.273644902 podStartE2EDuration="1.273644902s" podCreationTimestamp="2026-04-23 13:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:53.271366352 +0000 UTC m=+1365.211540811" watchObservedRunningTime="2026-04-23 13:54:53.273644902 +0000 UTC m=+1365.213819364"
Apr 23 13:54:53.367871 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.367844 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-59f44d7dcb-lmrp5_1f89c38d-dd50-4f03-a48f-87d000b0dd2b/registry/0.log"
Apr 23 13:54:53.415226 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:53.415202 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mx792_01fe271a-4dd9-4cd9-8fd7-07b0808bdb7c/node-ca/0.log"
Apr 23 13:54:54.406889 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:54.406839 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-slr9t_9d0ae972-b2b8-41fe-a688-e9f33be2d8f1/serve-healthcheck-canary/0.log"
Apr 23 13:54:54.922158 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:54.922134 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vc88g_74c3988c-0d03-4871-8bc1-3e6fe2005562/kube-rbac-proxy/0.log"
Apr 23 13:54:54.942370 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:54.942350 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vc88g_74c3988c-0d03-4871-8bc1-3e6fe2005562/exporter/0.log"
Apr 23 13:54:54.963356 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:54:54.963336 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vc88g_74c3988c-0d03-4871-8bc1-3e6fe2005562/extractor/0.log"
Apr 23 13:55:00.275041 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:00.275011 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-5v4xl"
Apr 23 13:55:01.879192 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:01.879118 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9vj47_8ebc3830-6349-407a-984d-4ca78ca8e182/kube-multus/0.log"
Apr 23 13:55:02.234160 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.234106 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/kube-multus-additional-cni-plugins/0.log"
Apr 23 13:55:02.257299 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.257272 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/egress-router-binary-copy/0.log"
Apr 23 13:55:02.281055 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.281034 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/cni-plugins/0.log"
Apr 23 13:55:02.300245 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.300224 2581 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/bond-cni-plugin/0.log" Apr 23 13:55:02.319302 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.319282 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/routeoverride-cni/0.log" Apr 23 13:55:02.338925 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.338905 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/whereabouts-cni-bincopy/0.log" Apr 23 13:55:02.358893 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.358871 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vwqk2_fe05d38e-d020-46dd-95d4-832fb5c93359/whereabouts-cni/0.log" Apr 23 13:55:02.453620 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.453600 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ctn87_e9cbb1cc-dcfc-4dac-99b7-8363fbef7774/network-metrics-daemon/0.log" Apr 23 13:55:02.471486 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:02.471452 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ctn87_e9cbb1cc-dcfc-4dac-99b7-8363fbef7774/kube-rbac-proxy/0.log" Apr 23 13:55:03.260600 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.260569 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-controller/0.log" Apr 23 13:55:03.278046 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.278019 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/0.log" Apr 23 13:55:03.290943 ip-10-0-129-103 
kubenswrapper[2581]: I0423 13:55:03.290913 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovn-acl-logging/1.log" Apr 23 13:55:03.312481 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.312417 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/kube-rbac-proxy-node/0.log" Apr 23 13:55:03.333797 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.333772 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 13:55:03.350596 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.350571 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/northd/0.log" Apr 23 13:55:03.370054 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.370026 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/nbdb/0.log" Apr 23 13:55:03.390705 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.390664 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/sbdb/0.log" Apr 23 13:55:03.544121 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:03.544078 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4shpw_8eaf8674-35ae-40d6-b12b-07e254516721/ovnkube-controller/0.log" Apr 23 13:55:04.988404 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:04.988378 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lwwp4_04f56a99-0f36-4cb1-bbb4-5f1009f833dc/check-endpoints/0.log" Apr 23 13:55:05.034246 
ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:05.034206 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-l79sj_774a8870-9d9e-4314-a059-b58aad91c605/network-check-target-container/0.log" Apr 23 13:55:06.013509 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:06.013483 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-msldf_50b7daad-20fe-4160-ba67-2e5371f39d68/iptables-alerter/0.log" Apr 23 13:55:06.627881 ip-10-0-129-103 kubenswrapper[2581]: I0423 13:55:06.627850 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zjrsd_3b32a45a-dd10-48d7-9261-50a4c50f588a/tuned/0.log"